Apr 17 09:08:39.306135 ip-10-0-130-147 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 09:08:39.306149 ip-10-0-130-147 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 09:08:39.306158 ip-10-0-130-147 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 09:08:39.306470 ip-10-0-130-147 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 09:08:49.466212 ip-10-0-130-147 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 09:08:49.466227 ip-10-0-130-147 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fe04019d713d4889b1c0191d8386eaa5 --
Apr 17 09:11:14.971158 ip-10-0-130-147 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 09:11:15.421582 ip-10-0-130-147 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:15.421582 ip-10-0-130-147 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 09:11:15.421582 ip-10-0-130-147 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:15.421582 ip-10-0-130-147 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 09:11:15.421582 ip-10-0-130-147 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:15.423424 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.423337 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 09:11:15.427864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427847 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:15.427864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427863 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427868 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427871 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427875 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427880 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427883 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427887 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427890 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427893 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427896 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427898 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427902 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427905 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427907 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427912 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427917 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427920 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427923 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427926 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427936 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:15.427933 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427940 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427943 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427947 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427949 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427952 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427955 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427958 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427960 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427963 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427965 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427968 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427971 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427973 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427979 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427983 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427985 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427988 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427990 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427993 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:15.428401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427996 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.427998 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428001 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428003 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428006 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428009 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428011 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428015 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428018 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428020 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428023 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428025 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428028 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428030 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428033 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428036 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428039 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428042 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428044 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428047 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:15.428864 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428049 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428052 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428054 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428057 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428060 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428063 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428065 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428068 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428070 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428073 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428076 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428078 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428081 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428084 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428088 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428090 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428097 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428100 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428103 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428106 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:15.429375 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428109 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428112 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428114 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428117 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428119 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428122 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428509 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428515 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428518 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428521 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428523 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428526 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428528 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428531 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428534 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428536 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428539 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428541 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428544 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428547 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:15.429856 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428550 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428552 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428555 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428557 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428560 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428562 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428567 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428571 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428574 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428577 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428580 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428583 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428586 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428589 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428592 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428595 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428597 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428600 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428603 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:15.430339 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428606 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428608 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428612 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428614 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428617 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428619 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428622 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428624 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428627 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428629 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428631 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428634 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428637 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428639 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428642 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428644 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428647 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428651 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428654 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:15.430807 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428657 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428659 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428662 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428664 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428667 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428669 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428672 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428675 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428678 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428681 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428683 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428686 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428689 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428692 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428694 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428697 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428699 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428702 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428704 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428707 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:15.431285 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428709 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428712 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428714 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428717 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428719 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428722 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428724 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428727 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428729 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428732 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428734 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428738 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428741 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.428743 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430062 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430072 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430080 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430084 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430089 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430093 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430098 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 09:11:15.431768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430103 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430106 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430110 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430114 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430117 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430121 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430124 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430127 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430130 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430133 2578 flags.go:64] FLAG: --cloud-config=""
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430136 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430139 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430143 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430146 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430150 2578 flags.go:64] FLAG: --config-dir=""
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430152 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430156 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430160 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430163 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430166 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430169 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430172 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430175 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430178 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430181 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 09:11:15.432308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430184 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430190 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430194 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430197 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430200 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430204 2578 flags.go:64] FLAG: --enable-server="true"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430207 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430212 2578 flags.go:64] FLAG: --event-burst="100"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430215 2578 flags.go:64] FLAG: --event-qps="50"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430219 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430221 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430224 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430228 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430231 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430234 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430237 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430240 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430243 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430246 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430249 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430252 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430255 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430258 2578 flags.go:64] FLAG: --feature-gates=""
Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430262
2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430265 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 09:11:15.432924 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430268 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430271 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430274 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430277 2578 flags.go:64] FLAG: --help="false" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430280 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-130-147.ec2.internal" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430283 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430286 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430289 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430292 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430295 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430298 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430301 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 09:11:15.433532 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430304 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430307 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430310 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430314 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430317 2578 flags.go:64] FLAG: --kube-reserved="" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430320 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430323 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430327 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430329 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430332 2578 flags.go:64] FLAG: --lock-file="" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430335 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430338 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 09:11:15.433532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430341 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430346 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430349 2578 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430352 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430355 2578 flags.go:64] FLAG: --logging-format="text" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430358 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430361 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430364 2578 flags.go:64] FLAG: --manifest-url="" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430367 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430372 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430375 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430379 2578 flags.go:64] FLAG: --max-pods="110" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430382 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430385 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430388 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430391 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430394 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 09:11:15.434164 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430397 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430400 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430408 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430412 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430415 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430418 2578 flags.go:64] FLAG: --pod-cidr="" Apr 17 09:11:15.434164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430421 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430426 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430430 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430433 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430436 2578 flags.go:64] FLAG: --port="10250" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430439 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430442 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02cf5c59749e85325" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430445 2578 flags.go:64] FLAG: --qos-reserved="" Apr 
17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430448 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430451 2578 flags.go:64] FLAG: --register-node="true" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430454 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430457 2578 flags.go:64] FLAG: --register-with-taints="" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430461 2578 flags.go:64] FLAG: --registry-burst="10" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430463 2578 flags.go:64] FLAG: --registry-qps="5" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430466 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430469 2578 flags.go:64] FLAG: --reserved-memory="" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430473 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430477 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430479 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430482 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430485 2578 flags.go:64] FLAG: --runonce="false" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430488 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430491 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430494 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430497 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430500 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 09:11:15.434713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430503 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430506 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430509 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430511 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430515 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430518 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430521 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430524 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430527 2578 flags.go:64] FLAG: --system-cgroups="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430530 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430535 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 09:11:15.435336 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:11:15.430538 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430541 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430545 2578 flags.go:64] FLAG: --tls-min-version="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430548 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430550 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430553 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430556 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430559 2578 flags.go:64] FLAG: --v="2" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430563 2578 flags.go:64] FLAG: --version="false" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430567 2578 flags.go:64] FLAG: --vmodule="" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430571 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.430575 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430675 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 09:11:15.435336 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430679 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430682 2578 feature_gate.go:328] unrecognized 
feature gate: NewOLMPreflightPermissionChecks Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430685 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430688 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430691 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430693 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430696 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430699 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430702 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430705 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430707 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430709 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430717 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430720 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 
09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430723 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430726 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430729 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430732 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430735 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 09:11:15.435945 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430737 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430740 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430743 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430745 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430748 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430750 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430753 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 
09:11:15.430756 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430760 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430763 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430766 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430769 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430773 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430776 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430779 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430782 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430784 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430787 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430790 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 09:11:15.436426 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430793 2578 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430795 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430798 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430801 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430804 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430807 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430812 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430815 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430818 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430821 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430823 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430826 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430829 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430832 
2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430845 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430848 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430851 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430853 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430856 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430859 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430861 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 09:11:15.436911 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430864 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430867 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430869 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430872 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430874 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 
09:11:15.430877 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430880 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430882 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430885 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430887 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430890 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430893 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430895 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430898 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430900 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430903 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430905 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430911 2578 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy
Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430914 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430916 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:15.437722 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430919 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430922 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430924 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430928 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430932 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.430935 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:15.438560 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.431661 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:15.439710 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.439682 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 09:11:15.439710 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.439710 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439792 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439800 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439805 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439810 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439815 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439820 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439824 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439829 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439849 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439855 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439859 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439864 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439868 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439874 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439878 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439883 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439887 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439891 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439896 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:15.439895 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439900 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439904 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439910 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439919 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439924 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439929 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439934 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439938 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439942 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439949 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439954 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439959 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439963 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439968 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439972 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439976 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439981 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439985 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439989 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:15.440765 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439993 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.439997 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440001 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440005 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440009 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440014 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440018 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440022 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440027 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440031 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440035 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440041 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440045 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440050 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440054 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440059 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440063 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440067 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440071 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440075 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:15.441323 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440080 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440084 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440088 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440092 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440096 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440100 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440104 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440109 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440113 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440117 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440121 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440126 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440130 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440135 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440139 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440143 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440147 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440152 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440158 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440162 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:15.441920 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440166 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440171 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440175 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440180 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440184 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440188 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440192 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440197 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.440206 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440367 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440375 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440381 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440387 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440392 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440396 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:15.442581 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440400 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440404 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440408 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440413 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440417 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440421 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440425 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440429 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440434 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440438 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440442 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440446 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440450 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440454 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440458 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440462 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440469 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440473 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440477 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440481 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:15.443084 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440488 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440493 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440499 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440503 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440508 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440512 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440517 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440521 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440526 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440530 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440535 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440539 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440544 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440548 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440552 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440556 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440560 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440564 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440569 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440573 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:15.443608 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440578 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440582 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440587 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440591 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440595 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440599 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440603 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440608 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440612 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440618 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440622 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440626 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440630 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440635 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440639 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440643 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440648 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440652 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440656 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440660 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:15.444328 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440665 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440669 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440673 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440677 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440681 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440685 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440689 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440694 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440698 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440702 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440706 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440711 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440715 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440719 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440723 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440726 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440731 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440735 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440739 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:15.445064 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:15.440743 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:15.445627 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.440752 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:15.445627 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.441533 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 09:11:15.445627 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.445477 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 09:11:15.446433 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.446418 2578 server.go:1019] "Starting client certificate rotation"
Apr 17 09:11:15.446533 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.446514 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:11:15.446570 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.446558 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:11:15.470287 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.470259 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:11:15.472715 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.472693 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:11:15.483861 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.483831 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 17 09:11:15.489228 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.489212 2578 log.go:25] "Validated CRI v1 image API"
Apr 17 09:11:15.491301 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.491285 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 09:11:15.497123 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.497100 2578 fs.go:135] Filesystem UUIDs: map[75d629e9-9983-4201-acf6-4a176156418f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8f8aba8f-abcc-42b8-bf8e-3cb83715e127:/dev/nvme0n1p3]
Apr 17 09:11:15.497201 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.497120 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 09:11:15.501713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.501694 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:11:15.502037 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.501929 2578 manager.go:217] Machine: {Timestamp:2026-04-17 09:11:15.500677644 +0000 UTC m=+0.402735015 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099526 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e206f4967dfbe5081eaa708762b22 SystemUUID:ec2e206f-4967-dfbe-5081-eaa708762b22 BootID:fe04019d-713d-4889-b1c0-191d8386eaa5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0f:60:84:60:ad Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0f:60:84:60:ad Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:79:f2:d4:15:f3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 09:11:15.502037 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.502030 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 09:11:15.502158 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.502110 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 09:11:15.505192 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505165 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 09:11:15.505325 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505194 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-147.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 09:11:15.505370 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505334 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 09:11:15.505370 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505343 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 09:11:15.505370 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505356 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 09:11:15.505450 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.505373 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 09:11:15.507532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.507521 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 09:11:15.507641 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.507632 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 09:11:15.510671 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.510660 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 09:11:15.510726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.510675 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 09:11:15.511292 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.511283 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 09:11:15.511338 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.511296 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 17 09:11:15.511338 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.511307 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 09:11:15.512328 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.512316 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 09:11:15.512385 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.512335 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 09:11:15.515568 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.515550 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 09:11:15.517264 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.517251 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 09:11:15.518545 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518534 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518551 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518558 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518563 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518570 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518577 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518586 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 09:11:15.518607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518604 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 09:11:15.518792 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518612 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 09:11:15.518792 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518618 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 09:11:15.518792 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518635 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 09:11:15.518792 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.518740 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 09:11:15.520531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.520520 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 09:11:15.520564 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.520532 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 09:11:15.521793 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.521773 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ncpcw"
Apr 17 09:11:15.523484 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.523464 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-147.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 09:11:15.523550 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.523486 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 09:11:15.523550 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.523494 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 09:11:15.524090 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.524077 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 09:11:15.524137 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.524116 2578 server.go:1295] "Started kubelet"
Apr 17 09:11:15.524278 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.524216 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 09:11:15.524808 ip-10-0-130-147 systemd[1]: Started Kubernetes Kubelet.
Apr 17 09:11:15.524971 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.524920 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 09:11:15.525045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.525000 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 09:11:15.525430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.525346 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 09:11:15.526536 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.526517 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 09:11:15.528981 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.528965 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ncpcw"
Apr 17 09:11:15.529263 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.528104 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-147.ec2.internal.18a719eb178d3598 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-147.ec2.internal,UID:ip-10-0-130-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-147.ec2.internal,},FirstTimestamp:2026-04-17 09:11:15.52408924 +0000 UTC m=+0.426146611,LastTimestamp:2026-04-17 09:11:15.52408924 +0000 UTC m=+0.426146611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-147.ec2.internal,}"
Apr 17 09:11:15.531350 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.531328 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 09:11:15.531857 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.531821 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 09:11:15.532440 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.532427 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 09:11:15.534468 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.533974 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.534468 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.533659 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 09:11:15.534468 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.534213 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 09:11:15.534468 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.534463 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 09:11:15.534468 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.534473 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 09:11:15.534822 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.534798 2578 factory.go:55] Registering systemd factory
Apr 17 09:11:15.534908 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.534859 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 17 09:11:15.536094 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536073 2578 factory.go:153] Registering CRI-O factory
Apr 17 09:11:15.536094 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536096 2578 factory.go:223] Registration of the crio container factory successfully
Apr 17 09:11:15.536240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536152 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 09:11:15.536240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536178 2578 factory.go:103] Registering Raw factory
Apr 17 09:11:15.536240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536196 2578 manager.go:1196] Started watching for new ooms in manager
Apr 17 09:11:15.536719 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.536695 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 09:11:15.536913 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.536897 2578 manager.go:319] Starting recovery of all containers
Apr 17 09:11:15.544989 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.544969 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:15.547732 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.547583 2578 manager.go:324] Recovery completed
Apr 17 09:11:15.547949 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.547931 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-147.ec2.internal\" not found" node="ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.552052 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.552039 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.554136 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554121 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.554196 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554147 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.554196 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554157 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.554619 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554604 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 09:11:15.554619 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554616 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 09:11:15.554702 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.554632 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 09:11:15.556754 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.556740 2578 policy_none.go:49] "None policy: Start"
Apr 17 09:11:15.556806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.556757 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 09:11:15.556806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.556768 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 09:11:15.589383 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589369 2578 manager.go:341] "Starting Device Plugin manager"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.589407 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589420 2578 server.go:85] "Starting device plugin registration server"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589656 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589665 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589748 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589817 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.589826 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.590384 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 09:11:15.598761 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.590416 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.653026 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.652983 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 09:11:15.654103 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.654078 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 09:11:15.654627 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.654607 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 09:11:15.654724 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.654644 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 09:11:15.654724 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.654654 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 09:11:15.654724 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.654699 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 09:11:15.657462 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.657446 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:15.690568 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.690499 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.691433 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.691416 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.691531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.691451 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.691531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.691467 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.691531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.691498 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.698177 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.698161 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.698255 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.698184 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-147.ec2.internal\": node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.709202 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.709180 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.755374 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.755344 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"]
Apr 17 09:11:15.755504 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.755424 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.756807 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.756791 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.756897 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.756829 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.756897 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.756857 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.758092 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.758081 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.758230 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.758215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.758274 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.758250 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.760417 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760395 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.760515 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760426 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.760515 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760441 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.760515 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760453 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.760515 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760474 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.760515 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.760487 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.761639 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.761625 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.761693 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.761649 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 09:11:15.762394 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.762377 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 09:11:15.762453 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.762402 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 09:11:15.762453 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.762414 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeHasSufficientPID"
Apr 17 09:11:15.788543 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.788519 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-147.ec2.internal\" not found" node="ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.793069 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.793053 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-147.ec2.internal\" not found" node="ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.810299 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.810278 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.835565 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.835540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.835656 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.835568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.835656 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.835587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e42dd9ccf4c8146ae96ed62e2dab724-config\") pod \"kube-apiserver-proxy-ip-10-0-130-147.ec2.internal\" (UID: \"2e42dd9ccf4c8146ae96ed62e2dab724\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.911040 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:15.911013 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:15.936431 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e42dd9ccf4c8146ae96ed62e2dab724-config\") pod \"kube-apiserver-proxy-ip-10-0-130-147.ec2.internal\" (UID: \"2e42dd9ccf4c8146ae96ed62e2dab724\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.936490 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.936490 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.936562 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.936562 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8447471fcb8cd989806ca435d93d03c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal\" (UID: \"a8447471fcb8cd989806ca435d93d03c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:15.936562 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:15.936521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e42dd9ccf4c8146ae96ed62e2dab724-config\") pod \"kube-apiserver-proxy-ip-10-0-130-147.ec2.internal\" (UID: \"2e42dd9ccf4c8146ae96ed62e2dab724\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:16.011671 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.011635 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:16.090344 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.090319 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:16.095828 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.095805 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:16.111782 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.111749 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:16.212605 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.212557 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:16.313200 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.313124 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-147.ec2.internal\" not found"
Apr 17 09:11:16.369306 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.369283 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:16.433112 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.433089 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:16.444724 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.444703 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:16.446221 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.446208 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 09:11:16.446326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.446312 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:16.446371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.446343 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal"
Apr 17 09:11:16.446371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.446357 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:16.446371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.446361 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:16.470202 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.470179 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:16.512417 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.512393 2578 apiserver.go:52] "Watching apiserver"
Apr 17 09:11:16.522235 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.522213 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 09:11:16.523303 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.523283 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-fffcg","openshift-image-registry/node-ca-mwm8s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal","openshift-multus/multus-4p8q2","openshift-multus/multus-additional-cni-plugins-7hzg2","openshift-network-diagnostics/network-check-target-vptzp","kube-system/konnectivity-agent-ht9mz","openshift-dns/node-resolver-7lzx7","openshift-multus/network-metrics-daemon-4h6v9","openshift-network-operator/iptables-alerter-tm6nr","openshift-ovn-kubernetes/ovnkube-node-w5vps","kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"]
Apr 17 09:11:16.525522 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.525506 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ht9mz"
Apr 17 09:11:16.526617 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.526603 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mwm8s"
Apr 17 09:11:16.527705 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.527685 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.528152 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.528133 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jksqb\""
Apr 17 09:11:16.528313 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.528298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 09:11:16.528624 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.528610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 09:11:16.528766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.528754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.529610 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529590 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wnmh2\""
Apr 17 09:11:16.529610 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529600 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 09:11:16.529610 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529608 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 09:11:16.529871 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529670 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 09:11:16.529871 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529770 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:16.529956 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.529885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.530000 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.529825 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:16.530208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530187 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 09:11:16.530208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 09:11:16.530394 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jh7bg\"" Apr 17 09:11:16.530394 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.530593 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530582 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.530932 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.530919 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.531294 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531280 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 09:11:16.531357 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 09:11:16.531661 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531641 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 09:11:16.531731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531684 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lhfqr\"" Apr 17 09:11:16.531881 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531830 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 09:06:15 +0000 UTC" deadline="2028-01-05 22:51:18.965395212 +0000 UTC" Apr 17 09:11:16.531951 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.531883 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15085h40m2.433515929s" Apr 17 09:11:16.532040 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.532021 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:16.532125 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.532104 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:16.532891 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.532868 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.532975 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.532914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-p9r2n\"" Apr 17 09:11:16.533185 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.533169 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.533611 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.533593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.533749 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.533733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wftd8\"" Apr 17 09:11:16.534551 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.533933 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.534551 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.534286 2578 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.536326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.536310 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.536711 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.536687 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 09:11:16.536903 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.536889 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nkzcx\"" Apr 17 09:11:16.536971 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.536912 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.537039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.536979 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.537509 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.537492 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.538672 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.538658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 09:11:16.538890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.538875 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 09:11:16.539113 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539096 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 09:11:16.539192 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539169 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vr8x8\"" Apr 17 09:11:16.539315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539195 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.539315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.539315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539211 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 09:11:16.539480 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: 
\"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.539540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcf8\" (UniqueName: \"kubernetes.io/projected/1f61bc12-5108-467a-9c7c-fc6b4db52b69-kube-api-access-llcf8\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.539540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-os-release\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.539641 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwgpm\" (UniqueName: \"kubernetes.io/projected/e6550303-873c-4278-9d7c-1b6d17d5f9eb-kube-api-access-wwgpm\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.539641 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-conf\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.539641 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-socket-dir-parent\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.539641 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn9m\" (UniqueName: \"kubernetes.io/projected/ea2ee429-d7fa-4703-99bd-5d963ebab30c-kube-api-access-ncn9m\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539655 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-iptables-alerter-script\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mwmt\" (UniqueName: \"kubernetes.io/projected/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-kube-api-access-6mwmt\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/054fc5ee-b86e-42a4-85c2-322e7ca088cf-host\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 
09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-host\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.539865 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-netns\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-bin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-hosts-file\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.539993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysconfig\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540031 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540057 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 09:11:16.540280 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:11:16.540073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-system-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-cnibin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-host-slash\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-kubernetes\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540263 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4298b\"" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.540280 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-lib-modules\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-tmp\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-conf-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-multus-daemon-config\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrnj\" (UniqueName: \"kubernetes.io/projected/164bdfae-ab57-4679-8440-11f5f905aca9-kube-api-access-jnrnj\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-system-cni-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540458 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-modprobe-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-systemd\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-multus\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-multus-certs\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fk85\" (UniqueName: \"kubernetes.io/projected/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-kube-api-access-2fk85\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 
09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f2e0357-0335-4771-8c1a-7da849e626c2-konnectivity-ca\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-sys\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-var-lib-kubelet\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-cni-binary-copy\") pod \"multus-4p8q2\" (UID: 
\"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-kubelet\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.541174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-hostroot\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-etc-kubernetes\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/054fc5ee-b86e-42a4-85c2-322e7ca088cf-serviceca\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-os-release\") pod 
\"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-run\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-tuned\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-k8s-cni-cncf-io\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-tmp-dir\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540826 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/5f2e0357-0335-4771-8c1a-7da849e626c2-agent-certs\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98kgz\" (UniqueName: \"kubernetes.io/projected/054fc5ee-b86e-42a4-85c2-322e7ca088cf-kube-api-access-98kgz\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.542063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.540879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cnibin\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.546728 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.546712 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 09:11:16.573931 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.573882 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-x6c9c" Apr 17 09:11:16.579673 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.579654 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-x6c9c" Apr 17 09:11:16.634316 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.634295 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 09:11:16.641232 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641211 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-system-cni-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.641309 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-modprobe-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641309 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-multus-certs\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.641309 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-log-socket\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641403 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-system-cni-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 
09:11:16.641403 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f2e0357-0335-4771-8c1a-7da849e626c2-konnectivity-ca\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:16.641403 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-modprobe-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641403 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-sys\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-var-lib-kubelet\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-sys\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641514 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:11:16.641442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-cni-binary-copy\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-systemd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-script-lib\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-var-lib-kubelet\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641514 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/054fc5ee-b86e-42a4-85c2-322e7ca088cf-serviceca\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.641514 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-run\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-k8s-cni-cncf-io\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-multus-certs\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-tmp-dir\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-socket-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.641731 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-var-lib-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f2e0357-0335-4771-8c1a-7da849e626c2-agent-certs\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llcf8\" (UniqueName: \"kubernetes.io/projected/1f61bc12-5108-467a-9c7c-fc6b4db52b69-kube-api-access-llcf8\") pod \"tuned-fffcg\" (UID: 
\"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-bin\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-netd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xq2t\" (UniqueName: \"kubernetes.io/projected/fdf7670d-1d61-4809-892c-ac96118b27f2-kube-api-access-2xq2t\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-conf\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn9m\" (UniqueName: 
\"kubernetes.io/projected/ea2ee429-d7fa-4703-99bd-5d963ebab30c-kube-api-access-ncn9m\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:16.641731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-iptables-alerter-script\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mwmt\" (UniqueName: \"kubernetes.io/projected/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-kube-api-access-6mwmt\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-netns\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.642311 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:11:16.641788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/054fc5ee-b86e-42a4-85c2-322e7ca088cf-host\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-netns\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " 
pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-k8s-cni-cncf-io\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.641958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-hosts-file\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-run-netns\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-cni-binary-copy\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 
09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-sys-fs\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/054fc5ee-b86e-42a4-85c2-322e7ca088cf-host\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-systemd-units\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.642311 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-ovn\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: 
\"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-kubernetes\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-conf-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642280 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-conf\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrnj\" (UniqueName: \"kubernetes.io/projected/164bdfae-ab57-4679-8440-11f5f905aca9-kube-api-access-jnrnj\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fk85\" (UniqueName: \"kubernetes.io/projected/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-kube-api-access-2fk85\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-tmp-dir\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-config\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-run\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-conf-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642638 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-systemd\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-multus\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-iptables-alerter-script\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-kubernetes\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysctl-d\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-hosts-file\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/054fc5ee-b86e-42a4-85c2-322e7ca088cf-serviceca\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.642865 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-device-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-kubelet\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-multus\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6550303-873c-4278-9d7c-1b6d17d5f9eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.642919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-systemd\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.643335 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:17.143314264 +0000 UTC m=+2.045371636 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-hostroot\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-etc-kubernetes\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-kubelet\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f2e0357-0335-4771-8c1a-7da849e626c2-konnectivity-ca\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.643815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-os-release\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-etc-kubernetes\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-hostroot\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-tuned\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf7670d-1d61-4809-892c-ac96118b27f2-ovn-node-metrics-cert\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-os-release\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98kgz\" (UniqueName: \"kubernetes.io/projected/054fc5ee-b86e-42a4-85c2-322e7ca088cf-kube-api-access-98kgz\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cnibin\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-os-release\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-kubelet\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-slash\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwgpm\" (UniqueName: \"kubernetes.io/projected/e6550303-873c-4278-9d7c-1b6d17d5f9eb-kube-api-access-wwgpm\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-socket-dir-parent\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-registration-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-host\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-bin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5kjv\" (UniqueName: \"kubernetes.io/projected/80319b72-ff10-4f90-bb63-b62eef3d559e-kube-api-access-j5kjv\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.644590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-etc-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-socket-dir-parent\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-os-release\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-node-log\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6550303-873c-4278-9d7c-1b6d17d5f9eb-cnibin\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.643945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysconfig\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-host\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-host-var-lib-cni-bin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-system-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-sysconfig\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-system-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-multus-cni-dir\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-cnibin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-host-slash\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/164bdfae-ab57-4679-8440-11f5f905aca9-cnibin\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.645345 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-env-overrides\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-host-slash\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-lib-modules\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-tmp\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-multus-daemon-config\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.644706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f61bc12-5108-467a-9c7c-fc6b4db52b69-lib-modules\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.646069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.645224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/164bdfae-ab57-4679-8440-11f5f905aca9-multus-daemon-config\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.646254 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.646140 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f2e0357-0335-4771-8c1a-7da849e626c2-agent-certs\") pod \"konnectivity-agent-ht9mz\" (UID: \"5f2e0357-0335-4771-8c1a-7da849e626c2\") " pod="kube-system/konnectivity-agent-ht9mz"
Apr 17 09:11:16.646355 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.646336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-etc-tuned\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.647210 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.647191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f61bc12-5108-467a-9c7c-fc6b4db52b69-tmp\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.650590 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.650571 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn9m\" (UniqueName: \"kubernetes.io/projected/ea2ee429-d7fa-4703-99bd-5d963ebab30c-kube-api-access-ncn9m\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9"
Apr 17 09:11:16.651122 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.651095 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrnj\" (UniqueName: \"kubernetes.io/projected/164bdfae-ab57-4679-8440-11f5f905aca9-kube-api-access-jnrnj\") pod \"multus-4p8q2\" (UID: \"164bdfae-ab57-4679-8440-11f5f905aca9\") " pod="openshift-multus/multus-4p8q2"
Apr 17 09:11:16.653877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.653819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mwmt\" (UniqueName: \"kubernetes.io/projected/c58c74c9-b33b-45a4-b98a-2c99ab16bff9-kube-api-access-6mwmt\") pod \"iptables-alerter-tm6nr\" (UID: \"c58c74c9-b33b-45a4-b98a-2c99ab16bff9\") " pod="openshift-network-operator/iptables-alerter-tm6nr"
Apr 17 09:11:16.655198 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.654982 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:16.655198 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.655010 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:16.655198 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.655024 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:16.655198 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:16.655081 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. No retries permitted until 2026-04-17 09:11:17.155063128 +0000 UTC m=+2.057120492 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:16.656194 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.656170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fk85\" (UniqueName: \"kubernetes.io/projected/8e70ed4d-e5a1-4a10-931b-32fe40414a5a-kube-api-access-2fk85\") pod \"node-resolver-7lzx7\" (UID: \"8e70ed4d-e5a1-4a10-931b-32fe40414a5a\") " pod="openshift-dns/node-resolver-7lzx7"
Apr 17 09:11:16.656281 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.656236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcf8\" (UniqueName: \"kubernetes.io/projected/1f61bc12-5108-467a-9c7c-fc6b4db52b69-kube-api-access-llcf8\") pod \"tuned-fffcg\" (UID: \"1f61bc12-5108-467a-9c7c-fc6b4db52b69\") " pod="openshift-cluster-node-tuning-operator/tuned-fffcg"
Apr 17 09:11:16.656800 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.656781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98kgz\" (UniqueName: \"kubernetes.io/projected/054fc5ee-b86e-42a4-85c2-322e7ca088cf-kube-api-access-98kgz\") pod \"node-ca-mwm8s\" (UID: \"054fc5ee-b86e-42a4-85c2-322e7ca088cf\") " pod="openshift-image-registry/node-ca-mwm8s"
Apr 17 09:11:16.657265 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.657246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwgpm\" (UniqueName: \"kubernetes.io/projected/e6550303-873c-4278-9d7c-1b6d17d5f9eb-kube-api-access-wwgpm\") pod \"multus-additional-cni-plugins-7hzg2\" (UID: \"e6550303-873c-4278-9d7c-1b6d17d5f9eb\") " pod="openshift-multus/multus-additional-cni-plugins-7hzg2"
Apr 17 09:11:16.661095 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.661053 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tm6nr"
Apr 17 09:11:16.697418 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.697386 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8447471fcb8cd989806ca435d93d03c.slice/crio-cf8aa925376089e66cf12fed414bfd68824eb25c4275fa54bb65f7fcb96636e7 WatchSource:0}: Error finding container cf8aa925376089e66cf12fed414bfd68824eb25c4275fa54bb65f7fcb96636e7: Status 404 returned error can't find the container with id cf8aa925376089e66cf12fed414bfd68824eb25c4275fa54bb65f7fcb96636e7
Apr 17 09:11:16.697955 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.697927 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e42dd9ccf4c8146ae96ed62e2dab724.slice/crio-0058f48927077c0af13e8648a7de2311dbbc6146f04e9f4e8e18348302f683ba WatchSource:0}: Error finding container 0058f48927077c0af13e8648a7de2311dbbc6146f04e9f4e8e18348302f683ba: Status 404 returned error can't find the container with id 0058f48927077c0af13e8648a7de2311dbbc6146f04e9f4e8e18348302f683ba
Apr 17 09:11:16.703053 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.703036 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 09:11:16.745433 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-config\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745433 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-device-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.745582 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745582 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf7670d-1d61-4809-892c-ac96118b27f2-ovn-node-metrics-cert\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745582 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-device-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.745582 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-kubelet\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-slash\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-registration-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-kubelet\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-slash\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-registration-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5kjv\" (UniqueName: \"kubernetes.io/projected/80319b72-ff10-4f90-bb63-b62eef3d559e-kube-api-access-j5kjv\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-etc-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.745766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-node-log\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-etc-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-env-overrides\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-node-log\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-log-socket\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745916 2578
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-log-socket\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-systemd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-systemd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-etc-selinux\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.745968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-script-lib\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746000 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-socket-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-var-lib-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-bin\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-netd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xq2t\" (UniqueName: \"kubernetes.io/projected/fdf7670d-1d61-4809-892c-ac96118b27f2-kube-api-access-2xq2t\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746155 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-var-lib-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746157 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-bin\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-netns\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-socket-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-env-overrides\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: 
\"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-sys-fs\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-cni-netd\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-systemd-units\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-ovn\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-host-run-netns\") pod 
\"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80319b72-ff10-4f90-bb63-b62eef3d559e-sys-fs\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-openvswitch\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-systemd-units\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.746819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fdf7670d-1d61-4809-892c-ac96118b27f2-run-ovn\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.747320 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-script-lib\") pod 
\"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.747320 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.746599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdf7670d-1d61-4809-892c-ac96118b27f2-ovnkube-config\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.747801 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.747769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdf7670d-1d61-4809-892c-ac96118b27f2-ovn-node-metrics-cert\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.755327 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.755300 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xq2t\" (UniqueName: \"kubernetes.io/projected/fdf7670d-1d61-4809-892c-ac96118b27f2-kube-api-access-2xq2t\") pod \"ovnkube-node-w5vps\" (UID: \"fdf7670d-1d61-4809-892c-ac96118b27f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.755327 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.755314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5kjv\" (UniqueName: \"kubernetes.io/projected/80319b72-ff10-4f90-bb63-b62eef3d559e-kube-api-access-j5kjv\") pod \"aws-ebs-csi-driver-node-dmhtf\" (UID: \"80319b72-ff10-4f90-bb63-b62eef3d559e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.849282 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.849208 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:11:16.858544 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.858520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:16.864641 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.864618 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2e0357_0335_4771_8c1a_7da849e626c2.slice/crio-ff136118458e40ffa891475d4d9a5d166fb77697db93140852576857222c400f WatchSource:0}: Error finding container ff136118458e40ffa891475d4d9a5d166fb77697db93140852576857222c400f: Status 404 returned error can't find the container with id ff136118458e40ffa891475d4d9a5d166fb77697db93140852576857222c400f Apr 17 09:11:16.877351 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.877332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mwm8s" Apr 17 09:11:16.883429 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.883403 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod054fc5ee_b86e_42a4_85c2_322e7ca088cf.slice/crio-8c93475d825bb2e939e8696b1a703da0bbea9e8dcf84bcdb0669e6dc1f6c59cf WatchSource:0}: Error finding container 8c93475d825bb2e939e8696b1a703da0bbea9e8dcf84bcdb0669e6dc1f6c59cf: Status 404 returned error can't find the container with id 8c93475d825bb2e939e8696b1a703da0bbea9e8dcf84bcdb0669e6dc1f6c59cf Apr 17 09:11:16.886187 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.886173 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4p8q2" Apr 17 09:11:16.891613 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.891590 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164bdfae_ab57_4679_8440_11f5f905aca9.slice/crio-4b77176c6bde43dd31a73a35b9b7fc506bc2037d5bad1d65d1f5641353b53972 WatchSource:0}: Error finding container 4b77176c6bde43dd31a73a35b9b7fc506bc2037d5bad1d65d1f5641353b53972: Status 404 returned error can't find the container with id 4b77176c6bde43dd31a73a35b9b7fc506bc2037d5bad1d65d1f5641353b53972 Apr 17 09:11:16.905220 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.905199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" Apr 17 09:11:16.910876 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.910855 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6550303_873c_4278_9d7c_1b6d17d5f9eb.slice/crio-1b81a70a18b306a3032825dd124cff795420f406a396a4b2a60b9aa9c8584857 WatchSource:0}: Error finding container 1b81a70a18b306a3032825dd124cff795420f406a396a4b2a60b9aa9c8584857: Status 404 returned error can't find the container with id 1b81a70a18b306a3032825dd124cff795420f406a396a4b2a60b9aa9c8584857 Apr 17 09:11:16.924767 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.924536 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fffcg" Apr 17 09:11:16.929916 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.929891 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f61bc12_5108_467a_9c7c_fc6b4db52b69.slice/crio-52ce9e38761df9fe336282f364d67c0e011eb3570582487705218275ebed7139 WatchSource:0}: Error finding container 52ce9e38761df9fe336282f364d67c0e011eb3570582487705218275ebed7139: Status 404 returned error can't find the container with id 52ce9e38761df9fe336282f364d67c0e011eb3570582487705218275ebed7139 Apr 17 09:11:16.944115 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.944099 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7lzx7" Apr 17 09:11:16.949266 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.949246 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e70ed4d_e5a1_4a10_931b_32fe40414a5a.slice/crio-16e5b4df3fe71d6438a1d1294019e61ad6f95951e6cf0002bfd642e5bded0f32 WatchSource:0}: Error finding container 16e5b4df3fe71d6438a1d1294019e61ad6f95951e6cf0002bfd642e5bded0f32: Status 404 returned error can't find the container with id 16e5b4df3fe71d6438a1d1294019e61ad6f95951e6cf0002bfd642e5bded0f32 Apr 17 09:11:16.977370 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.977347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:16.981901 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:16.981885 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" Apr 17 09:11:16.983304 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.983286 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf7670d_1d61_4809_892c_ac96118b27f2.slice/crio-e6ad74acce5d71f907df2c73392e8d75551f09109371b363da99bf0fedba1391 WatchSource:0}: Error finding container e6ad74acce5d71f907df2c73392e8d75551f09109371b363da99bf0fedba1391: Status 404 returned error can't find the container with id e6ad74acce5d71f907df2c73392e8d75551f09109371b363da99bf0fedba1391 Apr 17 09:11:16.988391 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:16.988369 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80319b72_ff10_4f90_bb63_b62eef3d559e.slice/crio-9806b6d129b4419b77f382b8d5673cb9e1a3b622142470edfb4815ae6b6c2c78 WatchSource:0}: Error finding container 9806b6d129b4419b77f382b8d5673cb9e1a3b622142470edfb4815ae6b6c2c78: Status 404 returned error can't find the container with id 9806b6d129b4419b77f382b8d5673cb9e1a3b622142470edfb4815ae6b6c2c78 Apr 17 09:11:17.150137 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.150103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:17.150278 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.150258 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:17.150348 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.150326 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:18.150307032 +0000 UTC m=+3.052364390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:17.152040 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:11:17.152011 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58c74c9_b33b_45a4_b98a_2c99ab16bff9.slice/crio-a7fc0349432ffb959101b25c18864ae7f4b8e755c331beba9328e2d50eb151f2 WatchSource:0}: Error finding container a7fc0349432ffb959101b25c18864ae7f4b8e755c331beba9328e2d50eb151f2: Status 404 returned error can't find the container with id a7fc0349432ffb959101b25c18864ae7f4b8e755c331beba9328e2d50eb151f2 Apr 17 09:11:17.251034 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.250983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:17.251243 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.251227 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:17.251309 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.251249 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:17.251309 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.251261 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:17.251412 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:17.251334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. No retries permitted until 2026-04-17 09:11:18.251317138 +0000 UTC m=+3.153374497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:17.457194 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.457166 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:11:17.591958 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.581152 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:16 +0000 UTC" deadline="2028-01-13 00:14:10.601097294 +0000 UTC" Apr 17 09:11:17.591958 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.581188 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15255h2m53.019913038s" Apr 17 
09:11:17.677582 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.677479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" event={"ID":"80319b72-ff10-4f90-bb63-b62eef3d559e","Type":"ContainerStarted","Data":"9806b6d129b4419b77f382b8d5673cb9e1a3b622142470edfb4815ae6b6c2c78"} Apr 17 09:11:17.684729 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.684696 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"e6ad74acce5d71f907df2c73392e8d75551f09109371b363da99bf0fedba1391"} Apr 17 09:11:17.697885 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.697709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7lzx7" event={"ID":"8e70ed4d-e5a1-4a10-931b-32fe40414a5a","Type":"ContainerStarted","Data":"16e5b4df3fe71d6438a1d1294019e61ad6f95951e6cf0002bfd642e5bded0f32"} Apr 17 09:11:17.713351 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.713321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerStarted","Data":"1b81a70a18b306a3032825dd124cff795420f406a396a4b2a60b9aa9c8584857"} Apr 17 09:11:17.724576 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.724520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mwm8s" event={"ID":"054fc5ee-b86e-42a4-85c2-322e7ca088cf","Type":"ContainerStarted","Data":"8c93475d825bb2e939e8696b1a703da0bbea9e8dcf84bcdb0669e6dc1f6c59cf"} Apr 17 09:11:17.736406 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.736285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tm6nr" 
event={"ID":"c58c74c9-b33b-45a4-b98a-2c99ab16bff9","Type":"ContainerStarted","Data":"a7fc0349432ffb959101b25c18864ae7f4b8e755c331beba9328e2d50eb151f2"} Apr 17 09:11:17.750308 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.750057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fffcg" event={"ID":"1f61bc12-5108-467a-9c7c-fc6b4db52b69","Type":"ContainerStarted","Data":"52ce9e38761df9fe336282f364d67c0e011eb3570582487705218275ebed7139"} Apr 17 09:11:17.763445 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.763416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4p8q2" event={"ID":"164bdfae-ab57-4679-8440-11f5f905aca9","Type":"ContainerStarted","Data":"4b77176c6bde43dd31a73a35b9b7fc506bc2037d5bad1d65d1f5641353b53972"} Apr 17 09:11:17.781224 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.781179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ht9mz" event={"ID":"5f2e0357-0335-4771-8c1a-7da849e626c2","Type":"ContainerStarted","Data":"ff136118458e40ffa891475d4d9a5d166fb77697db93140852576857222c400f"} Apr 17 09:11:17.799498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.799467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal" event={"ID":"a8447471fcb8cd989806ca435d93d03c","Type":"ContainerStarted","Data":"cf8aa925376089e66cf12fed414bfd68824eb25c4275fa54bb65f7fcb96636e7"} Apr 17 09:11:17.818106 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.818073 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal" event={"ID":"2e42dd9ccf4c8146ae96ed62e2dab724","Type":"ContainerStarted","Data":"0058f48927077c0af13e8648a7de2311dbbc6146f04e9f4e8e18348302f683ba"} Apr 17 09:11:17.856935 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:17.856795 2578 reflector.go:430] "Caches 
populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:11:18.158045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.158004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:18.158322 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.158179 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:18.158322 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.158248 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:20.158226784 +0000 UTC m=+5.060284160 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:18.167051 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.167023 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:11:18.258661 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.258614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:18.258854 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.258799 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:18.258854 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.258823 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:18.258854 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.258855 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:18.259023 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.258918 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. No retries permitted until 2026-04-17 09:11:20.258898592 +0000 UTC m=+5.160955952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:18.582085 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.582017 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:16 +0000 UTC" deadline="2027-11-22 04:05:28.040291405 +0000 UTC" Apr 17 09:11:18.582085 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.582083 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14010h54m9.458212348s" Apr 17 09:11:18.654960 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.654927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:18.655197 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.655063 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:18.655578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:18.655557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:18.655690 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:18.655649 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:20.173430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:20.173389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:20.173925 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.173538 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:20.173925 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.173602 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:24.173582555 +0000 UTC m=+9.075639917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:20.274452 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:20.274413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:20.274641 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.274613 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:20.274641 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.274632 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:20.274806 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.274644 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:20.274806 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.274705 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:24.274686352 +0000 UTC m=+9.176743718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:20.655446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:20.655411 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:20.655631 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:20.655426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:20.655631 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.655549 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:20.655631 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:20.655618 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:22.655451 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:22.655412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:22.655451 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:22.655456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:22.655963 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:22.655545 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:22.655963 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:22.655652 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:24.208439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:24.207868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:24.208439 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.208050 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:24.208439 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.208112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:32.208092885 +0000 UTC m=+17.110150259 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:24.309141 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:24.309103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:24.309319 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.309298 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:24.309319 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.309317 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:24.309426 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.309330 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:24.309426 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.309383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:32.309365309 +0000 UTC m=+17.211422677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:24.655471 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:24.655436 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:24.655665 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:24.655435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:24.655665 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.655590 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:24.655665 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:24.655620 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:26.654877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:26.654830 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:26.654877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:26.654855 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:26.655349 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:26.654965 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:26.655349 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:26.655088 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:28.654873 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:28.654831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:28.655293 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:28.654853 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:28.655293 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:28.654947 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:28.655293 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:28.655035 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:30.655385 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:30.655346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:30.655385 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:30.655372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:30.655853 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:30.655481 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:30.655853 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:30.655610 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:32.267339 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:32.267301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:32.267754 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.267462 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:32.267754 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.267534 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:11:48.267516133 +0000 UTC m=+33.169573514 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:32.368591 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:32.368563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:32.368888 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.368720 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:32.368888 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.368743 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:32.368888 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.368754 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:32.368888 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.368807 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:48.368789666 +0000 UTC m=+33.270847037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:32.655061 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:32.654976 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:32.655232 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:32.654976 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:32.655232 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.655103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:32.655356 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:32.655225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:34.655495 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:34.655463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:34.655944 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:34.655475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:34.655944 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:34.655566 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:34.655944 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:34.655653 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:35.850554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.850124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fffcg" event={"ID":"1f61bc12-5108-467a-9c7c-fc6b4db52b69","Type":"ContainerStarted","Data":"ccfbd5680a7a8c454c46571bae4701bb260af079ca6139d8d039b517f0e3deef"} Apr 17 09:11:35.852237 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.852214 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4p8q2" event={"ID":"164bdfae-ab57-4679-8440-11f5f905aca9","Type":"ContainerStarted","Data":"e059c01e6657fc8846b28319bf2bd70958b6d568e1a84162b56cae634c22cd91"} Apr 17 09:11:35.854283 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.854257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal" event={"ID":"2e42dd9ccf4c8146ae96ed62e2dab724","Type":"ContainerStarted","Data":"ffa1f098c44359fd6e2785c5d41e984371614a5c24dae2e26ef569f3f4a27de1"} Apr 17 09:11:35.859388 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"975e399187ad249e64cfd2bff3c0fe965f9cac575ec1c813f580c093904564c5"} Apr 17 09:11:35.859388 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"5f2114a9c0b077c6df88384fea53e0017f2740e8116d1223e169a2d297cbcc16"} Apr 17 09:11:35.859554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" 
event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"10f168af4e285d5ed639e489ea919b9e31382b569393e60115f690534e25638a"} Apr 17 09:11:35.859554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"d9287bc1220580ce157fdcbea91bda60887494df812821e90e26d85c04fd0235"} Apr 17 09:11:35.859554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"5093d8c861cdaf4ef8ea656d88a643a57b7883b07f6f5fed00db23ccd869b031"} Apr 17 09:11:35.859554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.859454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"e679f2538c6e6fb1ac107e638aeb6cded93a03c6876e0a7fc0472e1f999b1701"} Apr 17 09:11:35.867545 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.867489 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fffcg" podStartSLOduration=2.66302363 podStartE2EDuration="20.867474547s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.931117848 +0000 UTC m=+1.833175212" lastFinishedPulling="2026-04-17 09:11:35.13556877 +0000 UTC m=+20.037626129" observedRunningTime="2026-04-17 09:11:35.866452391 +0000 UTC m=+20.768509773" watchObservedRunningTime="2026-04-17 09:11:35.867474547 +0000 UTC m=+20.769531928" Apr 17 09:11:35.880488 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.880455 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-147.ec2.internal" podStartSLOduration=19.880444108 podStartE2EDuration="19.880444108s" podCreationTimestamp="2026-04-17 09:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:11:35.880343874 +0000 UTC m=+20.782401254" watchObservedRunningTime="2026-04-17 09:11:35.880444108 +0000 UTC m=+20.782501489" Apr 17 09:11:35.901639 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:35.901602 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4p8q2" podStartSLOduration=2.628404478 podStartE2EDuration="20.90159063s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.893199767 +0000 UTC m=+1.795257133" lastFinishedPulling="2026-04-17 09:11:35.166385916 +0000 UTC m=+20.068443285" observedRunningTime="2026-04-17 09:11:35.90149631 +0000 UTC m=+20.803553701" watchObservedRunningTime="2026-04-17 09:11:35.90159063 +0000 UTC m=+20.803648011" Apr 17 09:11:36.655551 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.655387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:36.655707 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.655387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:36.655707 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:36.655650 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:36.655707 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:36.655699 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:36.863079 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.863035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tm6nr" event={"ID":"c58c74c9-b33b-45a4-b98a-2c99ab16bff9","Type":"ContainerStarted","Data":"624c1716841be7573f2fcce97f6779bf24dab7563d7aad57e06eb22b6d771ddb"} Apr 17 09:11:36.864611 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.864581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ht9mz" event={"ID":"5f2e0357-0335-4771-8c1a-7da849e626c2","Type":"ContainerStarted","Data":"19b287f087fdb1553870b6797ce6648e11af23ae14d3969ef74dc748b62eba96"} Apr 17 09:11:36.868099 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.868074 2578 generic.go:358] "Generic (PLEG): container finished" podID="a8447471fcb8cd989806ca435d93d03c" containerID="c4b199047bad8b403fc414096f02bb4c22ea34d615165320c6c403b778f9b236" exitCode=0 Apr 17 09:11:36.868220 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.868136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal" event={"ID":"a8447471fcb8cd989806ca435d93d03c","Type":"ContainerDied","Data":"c4b199047bad8b403fc414096f02bb4c22ea34d615165320c6c403b778f9b236"} Apr 17 09:11:36.869743 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.869718 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" event={"ID":"80319b72-ff10-4f90-bb63-b62eef3d559e","Type":"ContainerStarted","Data":"98c1a9ddbc4a93af3e26457723c30eb7ccd890c1ebd78681f0d3c6db155d7746"} Apr 17 09:11:36.871066 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.871041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7lzx7" event={"ID":"8e70ed4d-e5a1-4a10-931b-32fe40414a5a","Type":"ContainerStarted","Data":"9764cf3425da8d53b6ec62fa8919cc685d6f3e201d5b5eab804a7c1abbaf3d85"} Apr 17 09:11:36.872370 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.872346 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="e6e70c07cc02e6f0e03d4b2f38c9c13c550864dd4020042e81cf6ed3879c5ab0" exitCode=0 Apr 17 09:11:36.872468 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.872413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"e6e70c07cc02e6f0e03d4b2f38c9c13c550864dd4020042e81cf6ed3879c5ab0"} Apr 17 09:11:36.873768 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.873718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mwm8s" event={"ID":"054fc5ee-b86e-42a4-85c2-322e7ca088cf","Type":"ContainerStarted","Data":"14e445302cde570b97d3e3140312c0d2c7c6b6ba651e5af5fc4d07d2ddafa518"} Apr 17 09:11:36.878025 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.877985 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tm6nr" podStartSLOduration=3.89912472 podStartE2EDuration="21.877972631s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:17.154004132 +0000 UTC m=+2.056061498" lastFinishedPulling="2026-04-17 
09:11:35.132852033 +0000 UTC m=+20.034909409" observedRunningTime="2026-04-17 09:11:36.877931984 +0000 UTC m=+21.779989366" watchObservedRunningTime="2026-04-17 09:11:36.877972631 +0000 UTC m=+21.780030026" Apr 17 09:11:36.891849 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.891801 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mwm8s" podStartSLOduration=3.643374958 podStartE2EDuration="21.891783403s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.884813076 +0000 UTC m=+1.786870434" lastFinishedPulling="2026-04-17 09:11:35.133221505 +0000 UTC m=+20.035278879" observedRunningTime="2026-04-17 09:11:36.891441924 +0000 UTC m=+21.793499311" watchObservedRunningTime="2026-04-17 09:11:36.891783403 +0000 UTC m=+21.793840819" Apr 17 09:11:36.931958 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.931923 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7lzx7" podStartSLOduration=3.764895484 podStartE2EDuration="21.931909395s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.950545902 +0000 UTC m=+1.852603265" lastFinishedPulling="2026-04-17 09:11:35.117559815 +0000 UTC m=+20.019617176" observedRunningTime="2026-04-17 09:11:36.931868738 +0000 UTC m=+21.833926118" watchObservedRunningTime="2026-04-17 09:11:36.931909395 +0000 UTC m=+21.833966780" Apr 17 09:11:36.965494 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:36.965451 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ht9mz" podStartSLOduration=3.730866207 podStartE2EDuration="21.965439687s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.866050263 +0000 UTC m=+1.768107621" lastFinishedPulling="2026-04-17 09:11:35.100623732 +0000 UTC m=+20.002681101" observedRunningTime="2026-04-17 
09:11:36.965219647 +0000 UTC m=+21.867277027" watchObservedRunningTime="2026-04-17 09:11:36.965439687 +0000 UTC m=+21.867497067" Apr 17 09:11:37.138336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.138304 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 09:11:37.603157 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.603038 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T09:11:37.138321945Z","UUID":"38f78627-e0e5-4088-a924-372b3eef3c43","Handler":null,"Name":"","Endpoint":""} Apr 17 09:11:37.606399 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.606373 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 09:11:37.606531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.606405 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 09:11:37.877944 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.877902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal" event={"ID":"a8447471fcb8cd989806ca435d93d03c","Type":"ContainerStarted","Data":"f652e9736ce5c72462ba7da49453e7759c2f08785375ba63f178de5c9d7ed1ca"} Apr 17 09:11:37.880483 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.880386 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" event={"ID":"80319b72-ff10-4f90-bb63-b62eef3d559e","Type":"ContainerStarted","Data":"fe8b3c1008c3aca6bcf17959832d6c33844680bf027896a9da1e8c1cad310b94"} Apr 17 09:11:37.894796 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.894745 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-147.ec2.internal" podStartSLOduration=21.894729504 podStartE2EDuration="21.894729504s" podCreationTimestamp="2026-04-17 09:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:11:37.894158369 +0000 UTC m=+22.796215749" watchObservedRunningTime="2026-04-17 09:11:37.894729504 +0000 UTC m=+22.796786886" Apr 17 09:11:37.896086 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.896061 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:37.896665 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:37.896643 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:38.655462 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.655431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:38.655638 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.655430 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:38.655638 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:38.655541 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:38.655748 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:38.655669 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:38.885486 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.885448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"20de1e1477bf18c7ca79f70c18fda346cd78c90453fea982e407d3d3cc8dfbef"} Apr 17 09:11:38.887534 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.887504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" event={"ID":"80319b72-ff10-4f90-bb63-b62eef3d559e","Type":"ContainerStarted","Data":"25e1ad1b13313e7ada5f9788e1ca7c3e9f332091725efe699671b4788a1fcc02"} Apr 17 09:11:38.888163 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.888137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:38.888525 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.888505 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ht9mz" Apr 17 09:11:38.918467 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:38.918393 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dmhtf" podStartSLOduration=2.909049718 podStartE2EDuration="23.918380028s" 
podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.989680106 +0000 UTC m=+1.891737469" lastFinishedPulling="2026-04-17 09:11:37.99901041 +0000 UTC m=+22.901067779" observedRunningTime="2026-04-17 09:11:38.904738458 +0000 UTC m=+23.806795851" watchObservedRunningTime="2026-04-17 09:11:38.918380028 +0000 UTC m=+23.820437444" Apr 17 09:11:40.655030 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:40.654863 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:40.655411 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:40.654885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:40.655411 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:40.655111 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:40.655411 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:40.655206 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:41.896419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.896328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" event={"ID":"fdf7670d-1d61-4809-892c-ac96118b27f2","Type":"ContainerStarted","Data":"6b66e5774eb5b71a4e93e4c77323f9373838a8cf154eae8aceeedf0a2d4c4712"} Apr 17 09:11:41.897267 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.896639 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:41.897887 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.897867 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="2fd8eaa9d30e3aa7f5901592187b2e94cc61dc5ceeb685dac3aeb0a492c166ca" exitCode=0 Apr 17 09:11:41.897966 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.897899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"2fd8eaa9d30e3aa7f5901592187b2e94cc61dc5ceeb685dac3aeb0a492c166ca"} Apr 17 09:11:41.911585 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.911560 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:41.924911 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:41.924871 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" podStartSLOduration=8.529066603 podStartE2EDuration="26.924860111s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.985338246 +0000 UTC m=+1.887395605" lastFinishedPulling="2026-04-17 09:11:35.381131741 +0000 UTC m=+20.283189113" observedRunningTime="2026-04-17 
09:11:41.923341282 +0000 UTC m=+26.825398663" watchObservedRunningTime="2026-04-17 09:11:41.924860111 +0000 UTC m=+26.826917488" Apr 17 09:11:42.655236 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.655201 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:42.655414 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.655206 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:42.655414 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:42.655307 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:42.655414 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:42.655376 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:42.902466 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.902290 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="ef79f523a4a6b5f1cf08330560e55db92590509dbe100d07104e95214841fb1d" exitCode=0 Apr 17 09:11:42.902819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.902372 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"ef79f523a4a6b5f1cf08330560e55db92590509dbe100d07104e95214841fb1d"} Apr 17 09:11:42.903338 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.903218 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:42.903338 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.903245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:42.918324 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:42.918254 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps" Apr 17 09:11:43.162330 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.162302 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4h6v9"] Apr 17 09:11:43.162495 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.162433 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:43.162598 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:43.162573 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:43.163043 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.163013 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vptzp"] Apr 17 09:11:43.163141 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.163106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:43.163227 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:43.163177 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:43.906349 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.906270 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="d49920e9c252757f91ec31a586b0f0ae4d42163306148594f08f5c514351df25" exitCode=0 Apr 17 09:11:43.906683 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:43.906352 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"d49920e9c252757f91ec31a586b0f0ae4d42163306148594f08f5c514351df25"} Apr 17 09:11:44.655257 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:44.655172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:44.655414 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:44.655172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:44.655414 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:44.655329 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:44.655682 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:44.655422 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:46.655022 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:46.654990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:46.655477 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:46.654997 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:46.655477 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:46.655103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:46.655477 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:46.655181 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:48.293063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.293027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:48.293526 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.293165 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:48.293526 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.293240 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:12:20.293222023 +0000 UTC m=+65.195279385 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:48.393828 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.393791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:48.394024 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.393978 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:48.394024 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.394016 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:48.394134 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.394029 2578 projected.go:194] Error preparing data for projected volume kube-api-access-fxf7t for pod openshift-network-diagnostics/network-check-target-vptzp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:48.394134 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.394091 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t podName:555d9d60-af04-44d3-b6cc-9af0c1398acd nodeName:}" failed. 
No retries permitted until 2026-04-17 09:12:20.394069341 +0000 UTC m=+65.296126714 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fxf7t" (UniqueName: "kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t") pod "network-check-target-vptzp" (UID: "555d9d60-af04-44d3-b6cc-9af0c1398acd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:48.655274 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.655194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp" Apr 17 09:11:48.655274 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.655223 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:11:48.655506 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.655318 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vptzp" podUID="555d9d60-af04-44d3-b6cc-9af0c1398acd" Apr 17 09:11:48.655506 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:48.655455 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c" Apr 17 09:11:48.859074 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.859047 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-147.ec2.internal" event="NodeReady" Apr 17 09:11:48.859289 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.859225 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 09:11:48.902748 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.902719 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mw8fd"] Apr 17 09:11:48.953147 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.953071 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jmvr9"] Apr 17 09:11:48.953314 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.953157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mw8fd" Apr 17 09:11:48.956726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.956698 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 09:11:48.956980 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.956960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\"" Apr 17 09:11:48.957131 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.957119 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 09:11:48.966556 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.966538 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mw8fd"] Apr 17 09:11:48.966656 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.966563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmvr9"] 
Apr 17 09:11:48.966716 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.966665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:48.969777 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.969751 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\""
Apr 17 09:11:48.970034 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.970005 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 09:11:48.970125 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.970040 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 09:11:48.970183 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:48.970121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 09:11:49.098359 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-tmp-dir\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.098359 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.098595 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2845\" (UniqueName: \"kubernetes.io/projected/a0becf09-e2a4-4fea-a602-f69826ef0f66-kube-api-access-w2845\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.098595 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.098595 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xh7r\" (UniqueName: \"kubernetes.io/projected/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-kube-api-access-6xh7r\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.098595 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.098510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-config-volume\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.199232 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-config-volume\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.199415 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-tmp-dir\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.199415 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.199568 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2845\" (UniqueName: \"kubernetes.io/projected/a0becf09-e2a4-4fea-a602-f69826ef0f66-kube-api-access-w2845\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.199707 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.199707 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.199675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xh7r\" (UniqueName: \"kubernetes.io/projected/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-kube-api-access-6xh7r\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.199991 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.199961 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:49.199991 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.199987 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:49.200178 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.200044 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:49.700024706 +0000 UTC m=+34.602082084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:11:49.200178 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.200067 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:49.700057193 +0000 UTC m=+34.602114557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:49.200296 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.200189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-config-volume\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.200354 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.200336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-tmp-dir\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.210492 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.210359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xh7r\" (UniqueName: \"kubernetes.io/projected/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-kube-api-access-6xh7r\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.210492 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.210476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2845\" (UniqueName: \"kubernetes.io/projected/a0becf09-e2a4-4fea-a602-f69826ef0f66-kube-api-access-w2845\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.704430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.704391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:49.704809 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:49.704484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:49.704809 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.704525 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:49.704809 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.704591 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:50.704574813 +0000 UTC m=+35.606632175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:49.704809 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.704591 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:49.704809 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:49.704643 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:50.704626324 +0000 UTC m=+35.606683696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:11:50.655927 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.655856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:11:50.656080 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.655856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9"
Apr 17 09:11:50.658777 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.658761 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 09:11:50.660098 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.660081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\""
Apr 17 09:11:50.660098 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.660094 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7jx4l\""
Apr 17 09:11:50.660195 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.660088 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 09:11:50.660195 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.660138 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 09:11:50.712415 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.712394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:50.712955 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.712447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:50.712955 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:50.712531 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:50.712955 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:50.712532 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:50.712955 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:50.712585 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:52.712569994 +0000 UTC m=+37.614627353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:50.712955 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:50.712600 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:52.712593477 +0000 UTC m=+37.614650835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:11:50.922069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.921993 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="868a60d8257d6211fd2b17b84b81458c5639cf7f565a04c02ae1a5a01b591d0d" exitCode=0
Apr 17 09:11:50.922069 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:50.922040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"868a60d8257d6211fd2b17b84b81458c5639cf7f565a04c02ae1a5a01b591d0d"}
Apr 17 09:11:51.926118 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:51.926084 2578 generic.go:358] "Generic (PLEG): container finished" podID="e6550303-873c-4278-9d7c-1b6d17d5f9eb" containerID="5f31cac282817a2c9bf63833a9746d770204c9c156108d372b6a999b42387c49" exitCode=0
Apr 17 09:11:51.926526 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:51.926128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerDied","Data":"5f31cac282817a2c9bf63833a9746d770204c9c156108d372b6a999b42387c49"}
Apr 17 09:11:52.725655 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:52.725629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:52.725745 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:52.725686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:52.725787 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:52.725767 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:52.725851 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:52.725824 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:56.72580926 +0000 UTC m=+41.627866619 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:52.725919 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:52.725776 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:52.725919 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:52.725890 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:56.725878915 +0000 UTC m=+41.627936300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:11:52.930613 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:52.930576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" event={"ID":"e6550303-873c-4278-9d7c-1b6d17d5f9eb","Type":"ContainerStarted","Data":"f6af798743f99cc4cd41d5fb35c24457ffe4a0c6776b2a8010d016ec88482031"}
Apr 17 09:11:52.954152 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:52.954110 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7hzg2" podStartSLOduration=4.921538699 podStartE2EDuration="37.954098657s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:16.912261416 +0000 UTC m=+1.814318779" lastFinishedPulling="2026-04-17 09:11:49.944821379 +0000 UTC m=+34.846878737" observedRunningTime="2026-04-17 09:11:52.952166106 +0000 UTC m=+37.854223488" watchObservedRunningTime="2026-04-17 09:11:52.954098657 +0000 UTC m=+37.856156038"
Apr 17 09:11:56.750371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:56.750341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:11:56.750731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:11:56.750391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:11:56.750731 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:56.750519 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:56.750731 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:56.750572 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:04.750558368 +0000 UTC m=+49.652615732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:11:56.750731 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:56.750519 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:56.750731 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:11:56.750639 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:04.750628332 +0000 UTC m=+49.652685694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:12:04.807437 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:04.807401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:12:04.807969 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:04.807474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:12:04.807969 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:04.807549 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:12:04.807969 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:04.807596 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:12:04.807969 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:04.807615 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:20.807597976 +0000 UTC m=+65.709655335 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:12:04.807969 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:04.807655 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:20.807638142 +0000 UTC m=+65.709695514 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:12:14.917869 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:14.917823 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w5vps"
Apr 17 09:12:20.306428 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.306386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9"
Apr 17 09:12:20.309247 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.309227 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 09:12:20.317544 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.317522 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 09:12:20.317631 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.317590 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:24.317568255 +0000 UTC m=+129.219625628 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : secret "metrics-daemon-secret" not found
Apr 17 09:12:20.406867 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.406826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:12:20.409946 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.409929 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 09:12:20.419795 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.419776 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 09:12:20.430418 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.430391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxf7t\" (UniqueName: \"kubernetes.io/projected/555d9d60-af04-44d3-b6cc-9af0c1398acd-kube-api-access-fxf7t\") pod \"network-check-target-vptzp\" (UID: \"555d9d60-af04-44d3-b6cc-9af0c1398acd\") " pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:12:20.668675 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.668607 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7jx4l\""
Apr 17 09:12:20.675828 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.675811 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:12:20.809276 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.809248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:12:20.809407 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.809314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:12:20.809472 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.809408 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:12:20.809472 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.809418 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:12:20.809472 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.809468 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:52.809449528 +0000 UTC m=+97.711506890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:12:20.809585 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:20.809483 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:52.809475938 +0000 UTC m=+97.711533297 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:12:20.853273 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.853168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vptzp"]
Apr 17 09:12:20.856779 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:12:20.856757 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555d9d60_af04_44d3_b6cc_9af0c1398acd.slice/crio-78e76788723def133b68b307d64b8afcb187897a53f3ca9839de70ba746a9337 WatchSource:0}: Error finding container 78e76788723def133b68b307d64b8afcb187897a53f3ca9839de70ba746a9337: Status 404 returned error can't find the container with id 78e76788723def133b68b307d64b8afcb187897a53f3ca9839de70ba746a9337
Apr 17 09:12:20.983721 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:20.983698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vptzp" event={"ID":"555d9d60-af04-44d3-b6cc-9af0c1398acd","Type":"ContainerStarted","Data":"78e76788723def133b68b307d64b8afcb187897a53f3ca9839de70ba746a9337"}
Apr 17 09:12:23.993123 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:23.993089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vptzp" event={"ID":"555d9d60-af04-44d3-b6cc-9af0c1398acd","Type":"ContainerStarted","Data":"154af6c19c23fc52fa5fed52896980e35d65afb202040f8ef0d0111832cb1a1e"}
Apr 17 09:12:23.993478 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:23.993241 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:12:24.009499 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:24.009454 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vptzp" podStartSLOduration=66.345792422 podStartE2EDuration="1m9.009441894s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:12:20.858675426 +0000 UTC m=+65.760732789" lastFinishedPulling="2026-04-17 09:12:23.522324902 +0000 UTC m=+68.424382261" observedRunningTime="2026-04-17 09:12:24.009377657 +0000 UTC m=+68.911435037" watchObservedRunningTime="2026-04-17 09:12:24.009441894 +0000 UTC m=+68.911499294"
Apr 17 09:12:52.823771 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:52.823708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:12:52.824240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:52.823798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9"
Apr 17 09:12:52.824240 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:52.823898 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:12:52.824240 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:52.823940 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:12:52.824240 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:52.823975 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls podName:b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:56.823956942 +0000 UTC m=+161.726014321 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls") pod "dns-default-mw8fd" (UID: "b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5") : secret "dns-default-metrics-tls" not found
Apr 17 09:12:52.824240 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:12:52.824003 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert podName:a0becf09-e2a4-4fea-a602-f69826ef0f66 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:56.823986245 +0000 UTC m=+161.726043604 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert") pod "ingress-canary-jmvr9" (UID: "a0becf09-e2a4-4fea-a602-f69826ef0f66") : secret "canary-serving-cert" not found
Apr 17 09:12:54.997318 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:12:54.997288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vptzp"
Apr 17 09:13:13.111845 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.111799 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rlbzq"]
Apr 17 09:13:13.114430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.114412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rlbzq"
Apr 17 09:13:13.116930 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.116906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 09:13:13.117070 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.116987 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 09:13:13.117174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.117159 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 09:13:13.118209 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.118190 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 09:13:13.118209 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.118200 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-4lkq8\""
Apr 17 09:13:13.122709 ip-10-0-130-147
kubenswrapper[2578]: I0417 09:13:13.122691 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 09:13:13.124512 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.124491 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rlbzq"] Apr 17 09:13:13.216563 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.216532 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"] Apr 17 09:13:13.219314 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.219299 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.224893 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.224827 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 09:13:13.224994 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.224907 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 09:13:13.225114 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.225089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-426jd\"" Apr 17 09:13:13.225561 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.225545 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 09:13:13.229831 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.229816 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 09:13:13.252796 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.252774 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"] Apr 17 09:13:13.260230 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtg9\" (UniqueName: \"kubernetes.io/projected/df06ee4d-da4b-4812-876f-8b39a0419cca-kube-api-access-qhtg9\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.260326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-snapshots\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.260326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df06ee4d-da4b-4812-876f-8b39a0419cca-serving-cert\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.260326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.260440 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260365 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-tmp\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.260440 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.260387 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-service-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361206 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-service-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361310 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtg9\" (UniqueName: \"kubernetes.io/projected/df06ee4d-da4b-4812-876f-8b39a0419cca-kube-api-access-qhtg9\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gwv\" (UniqueName: \"kubernetes.io/projected/1e2a7bc9-c500-464d-a161-c668f67f1430-kube-api-access-x4gwv\") pod 
\"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.361418 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-snapshots\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df06ee4d-da4b-4812-876f-8b39a0419cca-serving-cert\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1e2a7bc9-c500-464d-a161-c668f67f1430-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.361646 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:13:13.361537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.361646 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-tmp\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.361789 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-service-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.362021 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.361975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-tmp\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.362085 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.362065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/df06ee4d-da4b-4812-876f-8b39a0419cca-snapshots\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: 
\"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.362170 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.362153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df06ee4d-da4b-4812-876f-8b39a0419cca-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.363729 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.363711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df06ee4d-da4b-4812-876f-8b39a0419cca-serving-cert\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.373983 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.373961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtg9\" (UniqueName: \"kubernetes.io/projected/df06ee4d-da4b-4812-876f-8b39a0419cca-kube-api-access-qhtg9\") pod \"insights-operator-585dfdc468-rlbzq\" (UID: \"df06ee4d-da4b-4812-876f-8b39a0419cca\") " pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.423929 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.423910 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" Apr 17 09:13:13.462677 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.462653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gwv\" (UniqueName: \"kubernetes.io/projected/1e2a7bc9-c500-464d-a161-c668f67f1430-kube-api-access-x4gwv\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.462798 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.462695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1e2a7bc9-c500-464d-a161-c668f67f1430-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.462994 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.462970 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.463065 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:13.463036 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:13.463112 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:13.463097 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls podName:1e2a7bc9-c500-464d-a161-c668f67f1430 
nodeName:}" failed. No retries permitted until 2026-04-17 09:13:13.96308354 +0000 UTC m=+118.865140899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:13.463336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.463320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/1e2a7bc9-c500-464d-a161-c668f67f1430-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.474571 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.474548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gwv\" (UniqueName: \"kubernetes.io/projected/1e2a7bc9-c500-464d-a161-c668f67f1430-kube-api-access-x4gwv\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.534872 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.534828 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rlbzq"] Apr 17 09:13:13.537776 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:13.537750 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf06ee4d_da4b_4812_876f_8b39a0419cca.slice/crio-0ff4fdd5ad399979aa5e22dec0af02d22d9f10f14f1d3d7ed7db1f5fd93d92e3 WatchSource:0}: Error finding container 
0ff4fdd5ad399979aa5e22dec0af02d22d9f10f14f1d3d7ed7db1f5fd93d92e3: Status 404 returned error can't find the container with id 0ff4fdd5ad399979aa5e22dec0af02d22d9f10f14f1d3d7ed7db1f5fd93d92e3 Apr 17 09:13:13.967062 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:13.967025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:13.967222 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:13.967164 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:13.967263 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:13.967225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls podName:1e2a7bc9-c500-464d-a161-c668f67f1430 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:14.967210421 +0000 UTC m=+119.869267780 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:14.085555 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:14.085520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" event={"ID":"df06ee4d-da4b-4812-876f-8b39a0419cca","Type":"ContainerStarted","Data":"0ff4fdd5ad399979aa5e22dec0af02d22d9f10f14f1d3d7ed7db1f5fd93d92e3"} Apr 17 09:13:14.975166 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:14.975118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:14.975569 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:14.975216 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:14.975569 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:14.975279 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls podName:1e2a7bc9-c500-464d-a161-c668f67f1430 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:16.975261583 +0000 UTC m=+121.877318959 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:16.090324 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:16.090292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" event={"ID":"df06ee4d-da4b-4812-876f-8b39a0419cca","Type":"ContainerStarted","Data":"bc67b0da56e6243ebc2cd3bc4f883ccd2e8f154e8bdf062cc1f5c9975b10c1cd"} Apr 17 09:13:16.110389 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:16.110340 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" podStartSLOduration=1.211627934 podStartE2EDuration="3.110323863s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="2026-04-17 09:13:13.539567748 +0000 UTC m=+118.441625106" lastFinishedPulling="2026-04-17 09:13:15.438263676 +0000 UTC m=+120.340321035" observedRunningTime="2026-04-17 09:13:16.108441063 +0000 UTC m=+121.010498445" watchObservedRunningTime="2026-04-17 09:13:16.110323863 +0000 UTC m=+121.012381245" Apr 17 09:13:16.989929 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:16.989888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:16.990091 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:16.990031 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: 
secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:16.990091 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:16.990090 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls podName:1e2a7bc9-c500-464d-a161-c668f67f1430 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:20.990073892 +0000 UTC m=+125.892131271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:18.666899 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.666865 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"] Apr 17 09:13:18.669603 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.669587 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.672335 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.672312 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 09:13:18.672447 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.672314 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 09:13:18.672447 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.672365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 09:13:18.672447 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.672418 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9xfs\"" Apr 17 09:13:18.678620 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.678601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 09:13:18.678739 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.678713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"] Apr 17 09:13:18.802318 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802423 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802339 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802423 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802423 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802559 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802559 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token\") pod 
\"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802559 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqts\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.802658 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.802555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.884047 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.884028 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7lzx7_8e70ed4d-e5a1-4a10-931b-32fe40414a5a/dns-node-resolver/0.log" Apr 17 09:13:18.903262 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903352 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903352 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903417 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:18.903386 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:13:18.903417 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:18.903399 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759cb445bc-ttncw: secret "image-registry-tls" not found Apr 17 09:13:18.903491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903491 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:18.903458 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls podName:6936dfa6-312b-4b02-81ab-70c7ec72b7d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:19.403441807 +0000 UTC m=+124.305499167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls") pod "image-registry-759cb445bc-ttncw" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4") : secret "image-registry-tls" not found Apr 17 09:13:18.903491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903484 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903624 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903624 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqts\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903624 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " 
pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903785 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.903973 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.903947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.904932 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.904913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.905676 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.905660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.905757 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.905741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.911895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.911876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:18.912452 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:18.912434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqts\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:19.407395 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:19.407359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:19.407570 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:19.407500 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:13:19.407570 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:19.407516 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759cb445bc-ttncw: secret "image-registry-tls" not found Apr 
17 09:13:19.407570 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:19.407569 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls podName:6936dfa6-312b-4b02-81ab-70c7ec72b7d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:20.407553821 +0000 UTC m=+125.309611188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls") pod "image-registry-759cb445bc-ttncw" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4") : secret "image-registry-tls" not found Apr 17 09:13:19.884181 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:19.884150 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mwm8s_054fc5ee-b86e-42a4-85c2-322e7ca088cf/node-ca/0.log" Apr 17 09:13:20.021048 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.021016 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr"] Apr 17 09:13:20.025077 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.025060 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.027815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.027785 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 09:13:20.027815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.027806 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 09:13:20.027970 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.027787 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:13:20.027970 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.027862 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 09:13:20.028852 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.028821 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-vr6q5\"" Apr 17 09:13:20.033373 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.033351 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr"] Apr 17 09:13:20.111823 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.111793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkrbr\" (UniqueName: \"kubernetes.io/projected/153b51ca-f712-4926-8c50-8e76eed97427-kube-api-access-bkrbr\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.111993 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.111898 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/153b51ca-f712-4926-8c50-8e76eed97427-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.111993 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.111930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153b51ca-f712-4926-8c50-8e76eed97427-config\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.212784 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.212708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/153b51ca-f712-4926-8c50-8e76eed97427-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.212784 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.212737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153b51ca-f712-4926-8c50-8e76eed97427-config\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.213003 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.212799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkrbr\" (UniqueName: \"kubernetes.io/projected/153b51ca-f712-4926-8c50-8e76eed97427-kube-api-access-bkrbr\") pod 
\"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.213277 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.213257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153b51ca-f712-4926-8c50-8e76eed97427-config\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.214947 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.214927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/153b51ca-f712-4926-8c50-8e76eed97427-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.220902 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.220883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkrbr\" (UniqueName: \"kubernetes.io/projected/153b51ca-f712-4926-8c50-8e76eed97427-kube-api-access-bkrbr\") pod \"service-ca-operator-d6fc45fc5-kbtlr\" (UID: \"153b51ca-f712-4926-8c50-8e76eed97427\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.334124 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.334094 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" Apr 17 09:13:20.414722 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.414689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:20.414856 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:20.414822 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:13:20.414856 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:20.414852 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759cb445bc-ttncw: secret "image-registry-tls" not found Apr 17 09:13:20.414928 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:20.414915 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls podName:6936dfa6-312b-4b02-81ab-70c7ec72b7d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:22.414896007 +0000 UTC m=+127.316953370 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls") pod "image-registry-759cb445bc-ttncw" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4") : secret "image-registry-tls" not found Apr 17 09:13:20.442084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:20.442056 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr"] Apr 17 09:13:20.444863 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:20.444823 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153b51ca_f712_4926_8c50_8e76eed97427.slice/crio-6ded33791e737f710f2d3e15819b5d6fd12d92673d0a6f621f6e540e73fecdbf WatchSource:0}: Error finding container 6ded33791e737f710f2d3e15819b5d6fd12d92673d0a6f621f6e540e73fecdbf: Status 404 returned error can't find the container with id 6ded33791e737f710f2d3e15819b5d6fd12d92673d0a6f621f6e540e73fecdbf Apr 17 09:13:21.018828 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.018790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" Apr 17 09:13:21.019337 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:21.018959 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:21.019337 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:21.019051 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls 
podName:1e2a7bc9-c500-464d-a161-c668f67f1430 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:29.019028146 +0000 UTC m=+133.921085509 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found Apr 17 09:13:21.100211 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.100176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" event={"ID":"153b51ca-f712-4926-8c50-8e76eed97427","Type":"ContainerStarted","Data":"6ded33791e737f710f2d3e15819b5d6fd12d92673d0a6f621f6e540e73fecdbf"} Apr 17 09:13:21.194456 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.194423 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5"] Apr 17 09:13:21.197327 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.197306 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" Apr 17 09:13:21.199951 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.199921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 09:13:21.200999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.200981 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7l2cq\"" Apr 17 09:13:21.201084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.200989 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 09:13:21.204491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.204470 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5"] Apr 17 09:13:21.321713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.321636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkkc\" (UniqueName: \"kubernetes.io/projected/7ce72122-fc1c-425c-84e0-f0f52bc442e2-kube-api-access-2dkkc\") pod \"migrator-74bb7799d9-2tbn5\" (UID: \"7ce72122-fc1c-425c-84e0-f0f52bc442e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" Apr 17 09:13:21.422946 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.422908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkkc\" (UniqueName: \"kubernetes.io/projected/7ce72122-fc1c-425c-84e0-f0f52bc442e2-kube-api-access-2dkkc\") pod \"migrator-74bb7799d9-2tbn5\" (UID: \"7ce72122-fc1c-425c-84e0-f0f52bc442e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" Apr 17 09:13:21.431332 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:13:21.431306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkkc\" (UniqueName: \"kubernetes.io/projected/7ce72122-fc1c-425c-84e0-f0f52bc442e2-kube-api-access-2dkkc\") pod \"migrator-74bb7799d9-2tbn5\" (UID: \"7ce72122-fc1c-425c-84e0-f0f52bc442e2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" Apr 17 09:13:21.508464 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.508431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" Apr 17 09:13:21.623372 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:21.623337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5"] Apr 17 09:13:21.627246 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:21.627218 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce72122_fc1c_425c_84e0_f0f52bc442e2.slice/crio-54fce1c3b0aaf511c388d1462f492aa0563918bf3e943547e33e2ee60b1f6922 WatchSource:0}: Error finding container 54fce1c3b0aaf511c388d1462f492aa0563918bf3e943547e33e2ee60b1f6922: Status 404 returned error can't find the container with id 54fce1c3b0aaf511c388d1462f492aa0563918bf3e943547e33e2ee60b1f6922 Apr 17 09:13:22.103333 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:22.103298 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" event={"ID":"7ce72122-fc1c-425c-84e0-f0f52bc442e2","Type":"ContainerStarted","Data":"54fce1c3b0aaf511c388d1462f492aa0563918bf3e943547e33e2ee60b1f6922"} Apr 17 09:13:22.431056 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:22.431022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:13:22.431183 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:22.431155 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 09:13:22.431183 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:22.431170 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759cb445bc-ttncw: secret "image-registry-tls" not found Apr 17 09:13:22.431255 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:22.431221 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls podName:6936dfa6-312b-4b02-81ab-70c7ec72b7d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:26.431204045 +0000 UTC m=+131.333261417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls") pod "image-registry-759cb445bc-ttncw" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4") : secret "image-registry-tls" not found Apr 17 09:13:23.106244 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:23.106205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" event={"ID":"153b51ca-f712-4926-8c50-8e76eed97427","Type":"ContainerStarted","Data":"f28f944ed6570797fba8e8538daebffea9449de9229f816376477a670bbeadbd"} Apr 17 09:13:23.107465 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:23.107443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" event={"ID":"7ce72122-fc1c-425c-84e0-f0f52bc442e2","Type":"ContainerStarted","Data":"a7067f1d0cd3dc7a56799111b25df06b61356850c90d965e0da8cd78a5b401e4"} Apr 17 09:13:23.126072 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:23.126027 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" podStartSLOduration=1.2649820809999999 podStartE2EDuration="3.126010635s" podCreationTimestamp="2026-04-17 09:13:20 +0000 UTC" firstStartedPulling="2026-04-17 09:13:20.446555301 +0000 UTC m=+125.348612663" lastFinishedPulling="2026-04-17 09:13:22.307583854 +0000 UTC m=+127.209641217" observedRunningTime="2026-04-17 09:13:23.12513625 +0000 UTC m=+128.027193642" watchObservedRunningTime="2026-04-17 09:13:23.126010635 +0000 UTC m=+128.028068017" Apr 17 09:13:24.110823 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:24.110754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" 
event={"ID":"7ce72122-fc1c-425c-84e0-f0f52bc442e2","Type":"ContainerStarted","Data":"a56f07ded283a06da4e5a952a3708ab78e88b8a6c088eea4c03e09a9a56c977b"} Apr 17 09:13:24.126961 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:24.126917 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2tbn5" podStartSLOduration=1.760505639 podStartE2EDuration="3.126904927s" podCreationTimestamp="2026-04-17 09:13:21 +0000 UTC" firstStartedPulling="2026-04-17 09:13:21.629540781 +0000 UTC m=+126.531598145" lastFinishedPulling="2026-04-17 09:13:22.99594006 +0000 UTC m=+127.897997433" observedRunningTime="2026-04-17 09:13:24.126310255 +0000 UTC m=+129.028367633" watchObservedRunningTime="2026-04-17 09:13:24.126904927 +0000 UTC m=+129.028962308" Apr 17 09:13:24.346612 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:24.346575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:13:24.346773 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:24.346688 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:13:24.346773 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:24.346745 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs podName:ea2ee429-d7fa-4703-99bd-5d963ebab30c nodeName:}" failed. No retries permitted until 2026-04-17 09:15:26.346726371 +0000 UTC m=+251.248783733 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs") pod "network-metrics-daemon-4h6v9" (UID: "ea2ee429-d7fa-4703-99bd-5d963ebab30c") : secret "metrics-daemon-secret" not found Apr 17 09:13:26.297466 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.297426 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2ns74"] Apr 17 09:13:26.300429 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.300412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2ns74" Apr 17 09:13:26.303152 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.303127 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 09:13:26.303269 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.303157 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 09:13:26.303269 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.303241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 09:13:26.303351 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.303289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 09:13:26.304446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.304432 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-wn5dt\"" Apr 17 09:13:26.310648 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.310629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2ns74"] Apr 17 09:13:26.463999 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:13:26.463970 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qht6p\" (UniqueName: \"kubernetes.io/projected/b509e416-1dc1-4e60-84d5-82b1047cf091-kube-api-access-qht6p\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.464119 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.464018 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-cabundle\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.464119 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.464096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw"
Apr 17 09:13:26.464199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.464145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-key\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.464199 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:26.464167 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 09:13:26.464199 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:26.464178 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759cb445bc-ttncw: secret "image-registry-tls" not found
Apr 17 09:13:26.464287 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:26.464221 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls podName:6936dfa6-312b-4b02-81ab-70c7ec72b7d4 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:34.464206414 +0000 UTC m=+139.366263772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls") pod "image-registry-759cb445bc-ttncw" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4") : secret "image-registry-tls" not found
Apr 17 09:13:26.565063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.565002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-key\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.565063 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.565034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qht6p\" (UniqueName: \"kubernetes.io/projected/b509e416-1dc1-4e60-84d5-82b1047cf091-kube-api-access-qht6p\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.565205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.565065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-cabundle\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.565577 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.565557 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-cabundle\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.567350 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.567326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b509e416-1dc1-4e60-84d5-82b1047cf091-signing-key\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.573465 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.573442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qht6p\" (UniqueName: \"kubernetes.io/projected/b509e416-1dc1-4e60-84d5-82b1047cf091-kube-api-access-qht6p\") pod \"service-ca-865cb79987-2ns74\" (UID: \"b509e416-1dc1-4e60-84d5-82b1047cf091\") " pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.609343 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.609323 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2ns74"
Apr 17 09:13:26.719254 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:26.719221 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2ns74"]
Apr 17 09:13:26.723356 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:26.723325 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb509e416_1dc1_4e60_84d5_82b1047cf091.slice/crio-9f437c7c878ab17f86250b464d4529594ec6a81bbf3a59143a1fe86535219e84 WatchSource:0}: Error finding container 9f437c7c878ab17f86250b464d4529594ec6a81bbf3a59143a1fe86535219e84: Status 404 returned error can't find the container with id 9f437c7c878ab17f86250b464d4529594ec6a81bbf3a59143a1fe86535219e84
Apr 17 09:13:27.119339 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:27.119305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2ns74" event={"ID":"b509e416-1dc1-4e60-84d5-82b1047cf091","Type":"ContainerStarted","Data":"ab74465ea55a912655a8e00cb087528d675565c3e21eac647184c52351bac314"}
Apr 17 09:13:27.119339 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:27.119340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2ns74" event={"ID":"b509e416-1dc1-4e60-84d5-82b1047cf091","Type":"ContainerStarted","Data":"9f437c7c878ab17f86250b464d4529594ec6a81bbf3a59143a1fe86535219e84"}
Apr 17 09:13:27.144889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:27.144847 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2ns74" podStartSLOduration=1.144824504 podStartE2EDuration="1.144824504s" podCreationTimestamp="2026-04-17 09:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:27.144629932 +0000 UTC m=+132.046687313" watchObservedRunningTime="2026-04-17 09:13:27.144824504 +0000 UTC m=+132.046881885"
Apr 17 09:13:29.083209 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:29.083170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"
Apr 17 09:13:29.083618 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:29.083307 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 09:13:29.083618 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:29.083382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls podName:1e2a7bc9-c500-464d-a161-c668f67f1430 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:45.083363208 +0000 UTC m=+149.985420572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tkbrp" (UID: "1e2a7bc9-c500-464d-a161-c668f67f1430") : secret "cluster-monitoring-operator-tls" not found
Apr 17 09:13:34.522286 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:34.522240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw"
Apr 17 09:13:34.525226 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:34.525199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"image-registry-759cb445bc-ttncw\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " pod="openshift-image-registry/image-registry-759cb445bc-ttncw"
Apr 17 09:13:34.578463 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:34.578429 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-759cb445bc-ttncw"
Apr 17 09:13:34.698456 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:34.698423 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"]
Apr 17 09:13:34.701274 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:34.701237 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6936dfa6_312b_4b02_81ab_70c7ec72b7d4.slice/crio-519ddfcec1e507f758035ee60491c9318d1181632b3a266ab6ff0c1be7ac8af7 WatchSource:0}: Error finding container 519ddfcec1e507f758035ee60491c9318d1181632b3a266ab6ff0c1be7ac8af7: Status 404 returned error can't find the container with id 519ddfcec1e507f758035ee60491c9318d1181632b3a266ab6ff0c1be7ac8af7
Apr 17 09:13:35.139352 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:35.139323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" event={"ID":"6936dfa6-312b-4b02-81ab-70c7ec72b7d4","Type":"ContainerStarted","Data":"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9"}
Apr 17 09:13:35.139352 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:35.139356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" event={"ID":"6936dfa6-312b-4b02-81ab-70c7ec72b7d4","Type":"ContainerStarted","Data":"519ddfcec1e507f758035ee60491c9318d1181632b3a266ab6ff0c1be7ac8af7"}
Apr 17 09:13:35.139600 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:35.139474 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-759cb445bc-ttncw"
Apr 17 09:13:35.162056 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:35.162002 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" podStartSLOduration=17.161985454 podStartE2EDuration="17.161985454s" podCreationTimestamp="2026-04-17 09:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:35.160343376 +0000 UTC m=+140.062400759" watchObservedRunningTime="2026-04-17 09:13:35.161985454 +0000 UTC m=+140.064042836"
Apr 17 09:13:44.600453 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.600420 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"]
Apr 17 09:13:44.663945 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.663919 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fmzk9"]
Apr 17 09:13:44.668819 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.668804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.671637 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.671610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 09:13:44.671637 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.671631 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 09:13:44.673476 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.673458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f5jjb\""
Apr 17 09:13:44.679301 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.679282 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fmzk9"]
Apr 17 09:13:44.798884 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.798853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-crio-socket\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.799031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.798893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-data-volume\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.799031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.798934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.799031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.798980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmt2g\" (UniqueName: \"kubernetes.io/projected/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-api-access-zmt2g\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.799031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.799004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899422 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899550 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899435 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-crio-socket\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899550 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-data-volume\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899550 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899678 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmt2g\" (UniqueName: \"kubernetes.io/projected/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-api-access-zmt2g\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899678 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-crio-socket\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.899894 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.899875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-data-volume\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.900170 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.900148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.901754 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.901734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.908222 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.908200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmt2g\" (UniqueName: \"kubernetes.io/projected/a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f-kube-api-access-zmt2g\") pod \"insights-runtime-extractor-fmzk9\" (UID: \"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f\") " pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:44.978204 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:44.978185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fmzk9"
Apr 17 09:13:45.089331 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.089304 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fmzk9"]
Apr 17 09:13:45.092337 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:45.092310 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda95dc232_5251_4fa8_b5fe_cd5f8fbb5d6f.slice/crio-9b73b888e68e3d01d902a50813602b222448658301d2d6acac110eba9e575005 WatchSource:0}: Error finding container 9b73b888e68e3d01d902a50813602b222448658301d2d6acac110eba9e575005: Status 404 returned error can't find the container with id 9b73b888e68e3d01d902a50813602b222448658301d2d6acac110eba9e575005
Apr 17 09:13:45.101139 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.101112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"
Apr 17 09:13:45.103390 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.103370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e2a7bc9-c500-464d-a161-c668f67f1430-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tkbrp\" (UID: \"1e2a7bc9-c500-464d-a161-c668f67f1430\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"
Apr 17 09:13:45.166490 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.166462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fmzk9" event={"ID":"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f","Type":"ContainerStarted","Data":"adbd9eab663fd598848556bea348d2ec2513018657fe8a9bd55d209b4ed05fdb"}
Apr 17 09:13:45.166600 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.166493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fmzk9" event={"ID":"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f","Type":"ContainerStarted","Data":"9b73b888e68e3d01d902a50813602b222448658301d2d6acac110eba9e575005"}
Apr 17 09:13:45.330554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.330528 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-426jd\""
Apr 17 09:13:45.338557 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.338532 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"
Apr 17 09:13:45.450018 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:45.449989 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp"]
Apr 17 09:13:45.453267 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:45.453243 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2a7bc9_c500_464d_a161_c668f67f1430.slice/crio-96817577dad793531882e2b935d3091c577ce9c6d38fcc74d9f0006fe608633d WatchSource:0}: Error finding container 96817577dad793531882e2b935d3091c577ce9c6d38fcc74d9f0006fe608633d: Status 404 returned error can't find the container with id 96817577dad793531882e2b935d3091c577ce9c6d38fcc74d9f0006fe608633d
Apr 17 09:13:46.170169 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:46.170079 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" event={"ID":"1e2a7bc9-c500-464d-a161-c668f67f1430","Type":"ContainerStarted","Data":"96817577dad793531882e2b935d3091c577ce9c6d38fcc74d9f0006fe608633d"}
Apr 17 09:13:46.171783 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:46.171753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fmzk9" event={"ID":"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f","Type":"ContainerStarted","Data":"0769797050c6976bd6366e3c35d68401e03dc01d2eed5cae22010281a9691693"}
Apr 17 09:13:48.178620 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:48.178580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fmzk9" event={"ID":"a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f","Type":"ContainerStarted","Data":"30ad11e0d4c8abd39df1276db3ec5cc9cddb104927871661d91d0c42bc8087c7"}
Apr 17 09:13:48.179959 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:48.179933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" event={"ID":"1e2a7bc9-c500-464d-a161-c668f67f1430","Type":"ContainerStarted","Data":"c945a4f309d462fc4017eb7f222c136b6ca339884722624cac38ff87121b5abb"}
Apr 17 09:13:48.200199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:48.200149 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fmzk9" podStartSLOduration=1.942924047 podStartE2EDuration="4.200136399s" podCreationTimestamp="2026-04-17 09:13:44 +0000 UTC" firstStartedPulling="2026-04-17 09:13:45.155709618 +0000 UTC m=+150.057766981" lastFinishedPulling="2026-04-17 09:13:47.412921971 +0000 UTC m=+152.314979333" observedRunningTime="2026-04-17 09:13:48.195803329 +0000 UTC m=+153.097860711" watchObservedRunningTime="2026-04-17 09:13:48.200136399 +0000 UTC m=+153.102193809"
Apr 17 09:13:48.212549 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:48.212510 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tkbrp" podStartSLOduration=33.296006003 podStartE2EDuration="35.212500976s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="2026-04-17 09:13:45.455059546 +0000 UTC m=+150.357116906" lastFinishedPulling="2026-04-17 09:13:47.371554517 +0000 UTC m=+152.273611879" observedRunningTime="2026-04-17 09:13:48.211885921 +0000 UTC m=+153.113943302" watchObservedRunningTime="2026-04-17 09:13:48.212500976 +0000 UTC m=+153.114558357"
Apr 17 09:13:51.964257 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:51.964214 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mw8fd" podUID="b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5"
Apr 17 09:13:51.976425 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:51.976394 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jmvr9" podUID="a0becf09-e2a4-4fea-a602-f69826ef0f66"
Apr 17 09:13:52.189665 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:52.189637 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mw8fd"
Apr 17 09:13:53.669819 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:53.669785 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4h6v9" podUID="ea2ee429-d7fa-4703-99bd-5d963ebab30c"
Apr 17 09:13:54.604856 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:54.604800 2578 patch_prober.go:28] interesting pod/image-registry-759cb445bc-ttncw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 09:13:54.605010 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:54.604871 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 09:13:56.275036 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.274938 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"]
Apr 17 09:13:56.281074 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.281052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.283891 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.283683 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 09:13:56.283891 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.283813 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 09:13:56.283891 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.283851 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-shtcr\""
Apr 17 09:13:56.285272 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.285252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 09:13:56.288541 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.288522 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-95wqk"]
Apr 17 09:13:56.292214 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.292193 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"]
Apr 17 09:13:56.292329 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.292316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk"
Apr 17 09:13:56.292525 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.292507 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bt9x7"]
Apr 17 09:13:56.295100 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.295080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 09:13:56.295296 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.295281 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-gt6dn\""
Apr 17 09:13:56.295612 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.295592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 09:13:56.295804 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.295788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 09:13:56.296476 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.296461 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bt9x7"
Apr 17 09:13:56.298893 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.298875 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 09:13:56.299591 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.299573 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 09:13:56.299696 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.299679 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 09:13:56.299801 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.299788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xkdtg\""
Apr 17 09:13:56.309592 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.309570 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-95wqk"]
Apr 17 09:13:56.381919 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.381889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.382079 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.381937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrggs\" (UniqueName: \"kubernetes.io/projected/5e1ed79e-f41a-49ad-ab05-2843eaef7806-kube-api-access-nrggs\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.382079 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.382029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.382079 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.382064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e1ed79e-f41a-49ad-ab05-2843eaef7806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.483568 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-metrics-client-ca\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7"
Apr 17 09:13:56.483758 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.483758 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6gv\" (UniqueName: \"kubernetes.io/projected/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-api-access-9n6gv\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk"
Apr 17 09:13:56.483904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrggs\" (UniqueName: \"kubernetes.io/projected/5e1ed79e-f41a-49ad-ab05-2843eaef7806-kube-api-access-nrggs\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.483904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk"
Apr 17 09:13:56.483904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk"
Apr 17 09:13:56.483904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.483904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e1ed79e-f41a-49ad-ab05-2843eaef7806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"
Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-wtmp\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7"
Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.483975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-root\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7"
Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName:
\"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-accelerators-collector-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-textfile\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:56.484119 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 09:13:56.484153 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484108 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghw4b\" (UniqueName: \"kubernetes.io/projected/11b40bf5-a000-48d0-8a8c-3011e6e7249c-kube-api-access-ghw4b\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484490 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484490 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:56.484183 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls podName:5e1ed79e-f41a-49ad-ab05-2843eaef7806 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:56.984163091 +0000 UTC m=+161.886220452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-gc4f5" (UID: "5e1ed79e-f41a-49ad-ab05-2843eaef7806") : secret "openshift-state-metrics-tls" not found Apr 17 09:13:56.484601 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484601 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.484601 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484559 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fa4648bd-7936-480a-ab85-3130c7a997c6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.484751 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484601 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-sys\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.484851 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.484812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e1ed79e-f41a-49ad-ab05-2843eaef7806-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:56.487524 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.487499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:56.493702 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.493676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrggs\" (UniqueName: \"kubernetes.io/projected/5e1ed79e-f41a-49ad-ab05-2843eaef7806-kube-api-access-nrggs\") pod 
\"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:56.585725 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fa4648bd-7936-480a-ab85-3130c7a997c6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.585725 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-sys\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.585940 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-sys\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.585940 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-metrics-client-ca\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6gv\" (UniqueName: 
\"kubernetes.io/projected/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-api-access-9n6gv\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.585969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-wtmp\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/fa4648bd-7936-480a-ab85-3130c7a997c6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-root\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586195 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-root\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-accelerators-collector-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-textfile\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586315 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghw4b\" (UniqueName: \"kubernetes.io/projected/11b40bf5-a000-48d0-8a8c-3011e6e7249c-kube-api-access-ghw4b\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586600 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586600 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586600 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586600 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:13:56.586466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-metrics-client-ca\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-accelerators-collector-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.586779 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:56.586741 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 09:13:56.586906 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:56.586802 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls podName:11b40bf5-a000-48d0-8a8c-3011e6e7249c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:57.086784075 +0000 UTC m=+161.988841453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls") pod "node-exporter-bt9x7" (UID: "11b40bf5-a000-48d0-8a8c-3011e6e7249c") : secret "node-exporter-tls" not found Apr 17 09:13:56.586906 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.586906 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.587057 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.586961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-wtmp\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.587250 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.587225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-textfile\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " 
pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.589215 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.589149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.589339 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.589309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.589401 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.589377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.594997 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.594970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghw4b\" (UniqueName: \"kubernetes.io/projected/11b40bf5-a000-48d0-8a8c-3011e6e7249c-kube-api-access-ghw4b\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:56.596185 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.596162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6gv\" (UniqueName: 
\"kubernetes.io/projected/fa4648bd-7936-480a-ab85-3130c7a997c6-kube-api-access-9n6gv\") pod \"kube-state-metrics-69db897b98-95wqk\" (UID: \"fa4648bd-7936-480a-ab85-3130c7a997c6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.606830 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.606813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" Apr 17 09:13:56.742929 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.742897 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-95wqk"] Apr 17 09:13:56.745928 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:56.745901 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4648bd_7936_480a_ab85_3130c7a997c6.slice/crio-4d3553c7216cf0d71514802882c29c4917ddfc05b70d502e744f715ceba2c3a9 WatchSource:0}: Error finding container 4d3553c7216cf0d71514802882c29c4917ddfc05b70d502e744f715ceba2c3a9: Status 404 returned error can't find the container with id 4d3553c7216cf0d71514802882c29c4917ddfc05b70d502e744f715ceba2c3a9 Apr 17 09:13:56.889309 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.889247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9" Apr 17 09:13:56.889416 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.889316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd" Apr 17 
09:13:56.891518 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.891494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5-metrics-tls\") pod \"dns-default-mw8fd\" (UID: \"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5\") " pod="openshift-dns/dns-default-mw8fd" Apr 17 09:13:56.891605 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.891554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0becf09-e2a4-4fea-a602-f69826ef0f66-cert\") pod \"ingress-canary-jmvr9\" (UID: \"a0becf09-e2a4-4fea-a602-f69826ef0f66\") " pod="openshift-ingress-canary/ingress-canary-jmvr9" Apr 17 09:13:56.990097 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.990068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:56.992314 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.992292 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\"" Apr 17 09:13:56.992314 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:56.992305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e1ed79e-f41a-49ad-ab05-2843eaef7806-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gc4f5\" (UID: \"5e1ed79e-f41a-49ad-ab05-2843eaef7806\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:57.000727 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.000702 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mw8fd" Apr 17 09:13:57.090583 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.090556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:57.092746 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.092726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/11b40bf5-a000-48d0-8a8c-3011e6e7249c-node-exporter-tls\") pod \"node-exporter-bt9x7\" (UID: \"11b40bf5-a000-48d0-8a8c-3011e6e7249c\") " pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:57.119560 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.119540 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mw8fd"] Apr 17 09:13:57.121564 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:57.121542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cf6a67_48da_489c_8f76_ffb0dd4d5fe5.slice/crio-5397b31f9d9f0f21729d4cd46cf33ad6a8959d9c029725ea549ac590edef8436 WatchSource:0}: Error finding container 5397b31f9d9f0f21729d4cd46cf33ad6a8959d9c029725ea549ac590edef8436: Status 404 returned error can't find the container with id 5397b31f9d9f0f21729d4cd46cf33ad6a8959d9c029725ea549ac590edef8436 Apr 17 09:13:57.194802 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.194743 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" Apr 17 09:13:57.207572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.207545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" event={"ID":"fa4648bd-7936-480a-ab85-3130c7a997c6","Type":"ContainerStarted","Data":"4d3553c7216cf0d71514802882c29c4917ddfc05b70d502e744f715ceba2c3a9"} Apr 17 09:13:57.208570 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.208547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mw8fd" event={"ID":"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5","Type":"ContainerStarted","Data":"5397b31f9d9f0f21729d4cd46cf33ad6a8959d9c029725ea549ac590edef8436"} Apr 17 09:13:57.214790 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.214770 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bt9x7" Apr 17 09:13:57.222800 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:57.222759 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11b40bf5_a000_48d0_8a8c_3011e6e7249c.slice/crio-2f2392060dff5ab733637407aed99e40a6184a423f1139452d097fd52cf7a56e WatchSource:0}: Error finding container 2f2392060dff5ab733637407aed99e40a6184a423f1139452d097fd52cf7a56e: Status 404 returned error can't find the container with id 2f2392060dff5ab733637407aed99e40a6184a423f1139452d097fd52cf7a56e Apr 17 09:13:57.327426 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.327395 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5"] Apr 17 09:13:57.331241 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:57.331215 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e1ed79e_f41a_49ad_ab05_2843eaef7806.slice/crio-6d28fb319d72c2a0110bb7845f7a34d17c97f2fe29964ea192251828cad9445e WatchSource:0}: Error finding container 6d28fb319d72c2a0110bb7845f7a34d17c97f2fe29964ea192251828cad9445e: Status 404 returned error can't find the container with id 6d28fb319d72c2a0110bb7845f7a34d17c97f2fe29964ea192251828cad9445e
Apr 17 09:13:57.357721 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.357697 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 09:13:57.362036 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.362015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.365041 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 09:13:57.365254 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365239 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 09:13:57.365416 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365399 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 09:13:57.365490 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365436 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 09:13:57.365549 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365401 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 09:13:57.365626 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365610 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 09:13:57.366037 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365616 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 09:13:57.366037 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.365695 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gf2dg\""
Apr 17 09:13:57.368499 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.368477 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 09:13:57.368592 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.368580 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 09:13:57.382342 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.382324 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 09:13:57.394190 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394190 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394190 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394310 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394415 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394439 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpdf9\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394856 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.394856 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.394583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.495972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpdf9\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.496025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.496118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.496173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.497854 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.496203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.499461 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:13:57.498969 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle podName:ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:57.998945584 +0000 UTC m=+162.901002944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5") : configmap references non-existent config key: ca-bundle.crt
Apr 17 09:13:57.499667 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.499642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.499751 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.499681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.499944 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.499904 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.500085 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.500060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.501738 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.501709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.505299 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.502879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.505299 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.503537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.505299 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.503871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.505299 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.504823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.505578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.505414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.506061 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.506039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:57.511736 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:57.511656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpdf9\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:58.001303 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.001192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:58.002044 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.002018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:58.212715 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.212682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" event={"ID":"5e1ed79e-f41a-49ad-ab05-2843eaef7806","Type":"ContainerStarted","Data":"0724d0ae0183dd1512e3d48fb926c3cf43b178472cb013cf18057d71f3632cdd"}
Apr 17 09:13:58.212890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.212723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" event={"ID":"5e1ed79e-f41a-49ad-ab05-2843eaef7806","Type":"ContainerStarted","Data":"163a87f6c3772d69d7540d683ad36c488a27d1694e1730b2a9489b13171a1450"}
Apr 17 09:13:58.212890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.212738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" event={"ID":"5e1ed79e-f41a-49ad-ab05-2843eaef7806","Type":"ContainerStarted","Data":"6d28fb319d72c2a0110bb7845f7a34d17c97f2fe29964ea192251828cad9445e"}
Apr 17 09:13:58.213886 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.213864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bt9x7" event={"ID":"11b40bf5-a000-48d0-8a8c-3011e6e7249c","Type":"ContainerStarted","Data":"2f2392060dff5ab733637407aed99e40a6184a423f1139452d097fd52cf7a56e"}
Apr 17 09:13:58.288789 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:58.288711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 09:13:59.093055 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.092828 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 09:13:59.097043 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:13:59.097010 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddfa3f44_ca1e_44c1_9dd2_8d20bc0a1bc5.slice/crio-1fb979e19d5f312c7a6d245e500275497750d15c2d8132755bd8e4bb9271419b WatchSource:0}: Error finding container 1fb979e19d5f312c7a6d245e500275497750d15c2d8132755bd8e4bb9271419b: Status 404 returned error can't find the container with id 1fb979e19d5f312c7a6d245e500275497750d15c2d8132755bd8e4bb9271419b
Apr 17 09:13:59.236025 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.235954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mw8fd" event={"ID":"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5","Type":"ContainerStarted","Data":"f49272b3bddd748d077d39c1657179a5a06a9f8df4e4c85a21750056acdd615e"}
Apr 17 09:13:59.240116 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.239474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" event={"ID":"5e1ed79e-f41a-49ad-ab05-2843eaef7806","Type":"ContainerStarted","Data":"20b2f70ced8dd37eda2a521cf03e5227176d54cd90aa943d462e39604c8d2ecf"}
Apr 17 09:13:59.248804 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.248231 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" event={"ID":"fa4648bd-7936-480a-ab85-3130c7a997c6","Type":"ContainerStarted","Data":"7ea51fa2361a5773c11b43a6344d589637855934c269f2effa36150325a8d7ee"}
Apr 17 09:13:59.248804 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.248262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" event={"ID":"fa4648bd-7936-480a-ab85-3130c7a997c6","Type":"ContainerStarted","Data":"9a1df830dd1c2f6345c8e61107c10c2b9542ce992a6128430f00190e8e01e6ec"}
Apr 17 09:13:59.252890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.252807 2578 generic.go:358] "Generic (PLEG): container finished" podID="11b40bf5-a000-48d0-8a8c-3011e6e7249c" containerID="7173eb8c11bb67f18d6588f67eba88fadf23eb7f4c059197d5b1e4e271c2eb7d" exitCode=0
Apr 17 09:13:59.252999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.252914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bt9x7" event={"ID":"11b40bf5-a000-48d0-8a8c-3011e6e7249c","Type":"ContainerDied","Data":"7173eb8c11bb67f18d6588f67eba88fadf23eb7f4c059197d5b1e4e271c2eb7d"}
Apr 17 09:13:59.255097 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.254667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"1fb979e19d5f312c7a6d245e500275497750d15c2d8132755bd8e4bb9271419b"}
Apr 17 09:13:59.289106 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:13:59.289030 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gc4f5" podStartSLOduration=1.774049231 podStartE2EDuration="3.289009663s" podCreationTimestamp="2026-04-17 09:13:56 +0000 UTC" firstStartedPulling="2026-04-17 09:13:57.483954949 +0000 UTC m=+162.386012313" lastFinishedPulling="2026-04-17 09:13:58.99891538 +0000 UTC m=+163.900972745" observedRunningTime="2026-04-17 09:13:59.261713059 +0000 UTC m=+164.163770456" watchObservedRunningTime="2026-04-17 09:13:59.289009663 +0000 UTC m=+164.191067045"
Apr 17 09:14:00.259403 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.259369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" event={"ID":"fa4648bd-7936-480a-ab85-3130c7a997c6","Type":"ContainerStarted","Data":"a9a81edf14287ab9b9c26a5c8b659d3f07034d65d9f88f244cee1bface7fac1e"}
Apr 17 09:14:00.261166 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.261145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bt9x7" event={"ID":"11b40bf5-a000-48d0-8a8c-3011e6e7249c","Type":"ContainerStarted","Data":"91078968f5d1404be2512e38da20abb4bb455f8b451e9cce267d7bf5d62cd31e"}
Apr 17 09:14:00.261304 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.261170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bt9x7" event={"ID":"11b40bf5-a000-48d0-8a8c-3011e6e7249c","Type":"ContainerStarted","Data":"d94b6c128bcb4f95fe7abda8c4f021a93710aa71fb2d85bbb248d7376af708a6"}
Apr 17 09:14:00.262348 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.262322 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a" exitCode=0
Apr 17 09:14:00.262434 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.262413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a"}
Apr 17 09:14:00.264024 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.263992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mw8fd" event={"ID":"b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5","Type":"ContainerStarted","Data":"13242476890b135d415ec39d468d58e18ca25e314bb1efea78c112eec5e5318d"}
Apr 17 09:14:00.278662 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.278619 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-95wqk" podStartSLOduration=2.078982562 podStartE2EDuration="4.278606445s" podCreationTimestamp="2026-04-17 09:13:56 +0000 UTC" firstStartedPulling="2026-04-17 09:13:56.74770973 +0000 UTC m=+161.649767089" lastFinishedPulling="2026-04-17 09:13:58.947333611 +0000 UTC m=+163.849390972" observedRunningTime="2026-04-17 09:14:00.277626721 +0000 UTC m=+165.179684104" watchObservedRunningTime="2026-04-17 09:14:00.278606445 +0000 UTC m=+165.180663839"
Apr 17 09:14:00.333307 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.333267 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mw8fd" podStartSLOduration=130.509392274 podStartE2EDuration="2m12.333253967s" podCreationTimestamp="2026-04-17 09:11:48 +0000 UTC" firstStartedPulling="2026-04-17 09:13:57.123471264 +0000 UTC m=+162.025528623" lastFinishedPulling="2026-04-17 09:13:58.947332951 +0000 UTC m=+163.849390316" observedRunningTime="2026-04-17 09:14:00.333237615 +0000 UTC m=+165.235294995" watchObservedRunningTime="2026-04-17 09:14:00.333253967 +0000 UTC m=+165.235311348"
Apr 17 09:14:00.354916 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.354881 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bt9x7" podStartSLOduration=2.632874744 podStartE2EDuration="4.354871192s" podCreationTimestamp="2026-04-17 09:13:56 +0000 UTC" firstStartedPulling="2026-04-17 09:13:57.225359181 +0000 UTC m=+162.127416540" lastFinishedPulling="2026-04-17 09:13:58.94735562 +0000 UTC m=+163.849412988" observedRunningTime="2026-04-17 09:14:00.354419511 +0000 UTC m=+165.256476904" watchObservedRunningTime="2026-04-17 09:14:00.354871192 +0000 UTC m=+165.256928573"
Apr 17 09:14:00.585348 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.585319 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-884d56797-hgwvz"]
Apr 17 09:14:00.588430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.588413 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.591462 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591444 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 09:14:00.591462 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591453 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 09:14:00.591586 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 09:14:00.591586 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591453 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 09:14:00.591770 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591750 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a0salsg2tpehs\""
Apr 17 09:14:00.591907 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.591782 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-x2zf9\""
Apr 17 09:14:00.597575 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.597549 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-884d56797-hgwvz"]
Apr 17 09:14:00.627654 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-client-certs\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627744 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627662 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-client-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627744 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftc6g\" (UniqueName: \"kubernetes.io/projected/714bba7b-0745-4cc4-8d29-654fd2dd34f3-kube-api-access-ftc6g\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627825 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/714bba7b-0745-4cc4-8d29-654fd2dd34f3-audit-log\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627825 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627825 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-metrics-server-audit-profiles\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.627942 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.627854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-tls\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728547 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-client-certs\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-client-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftc6g\" (UniqueName: \"kubernetes.io/projected/714bba7b-0745-4cc4-8d29-654fd2dd34f3-kube-api-access-ftc6g\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/714bba7b-0745-4cc4-8d29-654fd2dd34f3-audit-log\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-metrics-server-audit-profiles\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.728986 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.728757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-tls\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.729168 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.729131 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/714bba7b-0745-4cc4-8d29-654fd2dd34f3-audit-log\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.729545 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.729503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.730030 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.730006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/714bba7b-0745-4cc4-8d29-654fd2dd34f3-metrics-server-audit-profiles\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz"
Apr 17 09:14:00.731297 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.731272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-client-ca-bundle\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:00.731399 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.731382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-client-certs\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:00.731484 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.731468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/714bba7b-0745-4cc4-8d29-654fd2dd34f3-secret-metrics-server-tls\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:00.737820 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.737798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftc6g\" (UniqueName: \"kubernetes.io/projected/714bba7b-0745-4cc4-8d29-654fd2dd34f3-kube-api-access-ftc6g\") pod \"metrics-server-884d56797-hgwvz\" (UID: \"714bba7b-0745-4cc4-8d29-654fd2dd34f3\") " pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:00.897226 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:00.897143 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:01.037027 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.036994 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-884d56797-hgwvz"] Apr 17 09:14:01.041015 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:01.040989 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714bba7b_0745_4cc4_8d29_654fd2dd34f3.slice/crio-c816a2fd7ab9801484b2291ea84070158e9bbf38832cd9e00a1dc2b949202c01 WatchSource:0}: Error finding container c816a2fd7ab9801484b2291ea84070158e9bbf38832cd9e00a1dc2b949202c01: Status 404 returned error can't find the container with id c816a2fd7ab9801484b2291ea84070158e9bbf38832cd9e00a1dc2b949202c01 Apr 17 09:14:01.062726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.062688 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g424c"] Apr 17 09:14:01.069890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.069867 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:01.073007 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.072785 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 09:14:01.073007 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.072820 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-654zm\"" Apr 17 09:14:01.080355 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.080337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g424c"] Apr 17 09:14:01.132344 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.132272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g424c\" (UID: \"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:01.233172 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.233142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g424c\" (UID: \"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:01.233314 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:14:01.233294 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 09:14:01.233369 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:14:01.233358 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert podName:1ed627b8-2f3b-4512-bf7a-1110aa6c22b5 nodeName:}" failed. No retries permitted until 2026-04-17 09:14:01.733341828 +0000 UTC m=+166.635399187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-g424c" (UID: "1ed627b8-2f3b-4512-bf7a-1110aa6c22b5") : secret "monitoring-plugin-cert" not found Apr 17 09:14:01.268274 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.268235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" event={"ID":"714bba7b-0745-4cc4-8d29-654fd2dd34f3","Type":"ContainerStarted","Data":"c816a2fd7ab9801484b2291ea84070158e9bbf38832cd9e00a1dc2b949202c01"} Apr 17 09:14:01.268664 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.268411 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mw8fd" Apr 17 09:14:01.738521 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.738499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g424c\" (UID: \"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:01.740806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.740784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1ed627b8-2f3b-4512-bf7a-1110aa6c22b5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g424c\" (UID: \"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 
09:14:01.981965 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:01.981933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:02.108060 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.108038 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g424c"] Apr 17 09:14:02.111401 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:02.111361 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed627b8_2f3b_4512_bf7a_1110aa6c22b5.slice/crio-c22a3aa44548f25e8c2099a1cd83cfd0c2f28b60fd4dc8f7e4757f9d54624f1d WatchSource:0}: Error finding container c22a3aa44548f25e8c2099a1cd83cfd0c2f28b60fd4dc8f7e4757f9d54624f1d: Status 404 returned error can't find the container with id c22a3aa44548f25e8c2099a1cd83cfd0c2f28b60fd4dc8f7e4757f9d54624f1d Apr 17 09:14:02.275313 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.275281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8"} Apr 17 09:14:02.275683 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.275319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95"} Apr 17 09:14:02.275683 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.275335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8"} Apr 17 
09:14:02.275683 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.275370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155"} Apr 17 09:14:02.275683 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.275380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61"} Apr 17 09:14:02.277053 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.277026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" event={"ID":"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5","Type":"ContainerStarted","Data":"c22a3aa44548f25e8c2099a1cd83cfd0c2f28b60fd4dc8f7e4757f9d54624f1d"} Apr 17 09:14:02.655864 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.655766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmvr9" Apr 17 09:14:02.658739 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.658713 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\"" Apr 17 09:14:02.666258 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.666239 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jmvr9" Apr 17 09:14:02.810750 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:02.810600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jmvr9"] Apr 17 09:14:02.812857 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:02.812817 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0becf09_e2a4_4fea_a602_f69826ef0f66.slice/crio-76f88c843bf5db92b6498a226239e19801215346c56bcc6d4ca9a8a706ff8a6d WatchSource:0}: Error finding container 76f88c843bf5db92b6498a226239e19801215346c56bcc6d4ca9a8a706ff8a6d: Status 404 returned error can't find the container with id 76f88c843bf5db92b6498a226239e19801215346c56bcc6d4ca9a8a706ff8a6d Apr 17 09:14:03.282434 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:03.282331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" event={"ID":"714bba7b-0745-4cc4-8d29-654fd2dd34f3","Type":"ContainerStarted","Data":"9f5d48320b36ec79eb5254a7aaad6a2b8bc7771aed4c3a67dc90d7f2b6d9717d"} Apr 17 09:14:03.283893 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:03.283863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmvr9" event={"ID":"a0becf09-e2a4-4fea-a602-f69826ef0f66","Type":"ContainerStarted","Data":"76f88c843bf5db92b6498a226239e19801215346c56bcc6d4ca9a8a706ff8a6d"} Apr 17 09:14:03.287523 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:03.287498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerStarted","Data":"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d"} Apr 17 09:14:03.303006 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:03.302963 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" podStartSLOduration=1.680291392 podStartE2EDuration="3.302951661s" podCreationTimestamp="2026-04-17 09:14:00 +0000 UTC" firstStartedPulling="2026-04-17 09:14:01.043139133 +0000 UTC m=+165.945196492" lastFinishedPulling="2026-04-17 09:14:02.665799399 +0000 UTC m=+167.567856761" observedRunningTime="2026-04-17 09:14:03.301548211 +0000 UTC m=+168.203605594" watchObservedRunningTime="2026-04-17 09:14:03.302951661 +0000 UTC m=+168.205009041" Apr 17 09:14:03.331040 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:03.330989 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.424459724 podStartE2EDuration="6.330973683s" podCreationTimestamp="2026-04-17 09:13:57 +0000 UTC" firstStartedPulling="2026-04-17 09:13:59.099674828 +0000 UTC m=+164.001732192" lastFinishedPulling="2026-04-17 09:14:03.006188787 +0000 UTC m=+167.908246151" observedRunningTime="2026-04-17 09:14:03.329257673 +0000 UTC m=+168.231315054" watchObservedRunningTime="2026-04-17 09:14:03.330973683 +0000 UTC m=+168.233031061" Apr 17 09:14:04.604309 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:04.604286 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:14:05.295489 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:05.295456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" event={"ID":"1ed627b8-2f3b-4512-bf7a-1110aa6c22b5","Type":"ContainerStarted","Data":"93ff5b7d89039daa274370435b7375d664b90d8901e24ab95799d54a292bc1e3"} Apr 17 09:14:05.295694 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:05.295637 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:05.296853 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:14:05.296818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jmvr9" event={"ID":"a0becf09-e2a4-4fea-a602-f69826ef0f66","Type":"ContainerStarted","Data":"4df1e990caedfd90cd5ec24392110abfc45b6ae2b6a560f2fec40a277bab9cce"} Apr 17 09:14:05.300798 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:05.300782 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" Apr 17 09:14:05.311711 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:05.311674 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g424c" podStartSLOduration=1.8324102610000002 podStartE2EDuration="4.311660753s" podCreationTimestamp="2026-04-17 09:14:01 +0000 UTC" firstStartedPulling="2026-04-17 09:14:02.114328592 +0000 UTC m=+167.016385954" lastFinishedPulling="2026-04-17 09:14:04.593579084 +0000 UTC m=+169.495636446" observedRunningTime="2026-04-17 09:14:05.310171932 +0000 UTC m=+170.212229305" watchObservedRunningTime="2026-04-17 09:14:05.311660753 +0000 UTC m=+170.213718134" Apr 17 09:14:05.326090 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:05.326047 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jmvr9" podStartSLOduration=135.503379528 podStartE2EDuration="2m17.326037212s" podCreationTimestamp="2026-04-17 09:11:48 +0000 UTC" firstStartedPulling="2026-04-17 09:14:02.815679194 +0000 UTC m=+167.717736553" lastFinishedPulling="2026-04-17 09:14:04.638336867 +0000 UTC m=+169.540394237" observedRunningTime="2026-04-17 09:14:05.325006727 +0000 UTC m=+170.227064110" watchObservedRunningTime="2026-04-17 09:14:05.326037212 +0000 UTC m=+170.228094589" Apr 17 09:14:07.655177 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:07.655144 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:14:09.618598 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.618559 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerName="registry" containerID="cri-o://659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9" gracePeriod=30 Apr 17 09:14:09.857130 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.857108 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:14:09.907651 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907587 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907651 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907651 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907650 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907687 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907706 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907726 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907743 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.907889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.907779 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbqts\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts\") pod \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\" (UID: \"6936dfa6-312b-4b02-81ab-70c7ec72b7d4\") " Apr 17 09:14:09.908138 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.908091 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:14:09.908138 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.908099 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:14:09.910265 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.910212 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:14:09.910373 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.910313 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:14:09.910432 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.910416 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:09.910487 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.910445 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts" (OuterVolumeSpecName: "kube-api-access-fbqts") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "kube-api-access-fbqts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:09.910487 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.910452 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:14:09.915457 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:09.915425 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6936dfa6-312b-4b02-81ab-70c7ec72b7d4" (UID: "6936dfa6-312b-4b02-81ab-70c7ec72b7d4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:14:10.009082 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009060 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-ca-trust-extracted\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009082 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009080 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-installation-pull-secrets\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009092 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-image-registry-private-configuration\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009103 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-tls\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009112 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbqts\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-kube-api-access-fbqts\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009120 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-registry-certificates\") on node 
\"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009128 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-trusted-ca\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.009205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.009137 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6936dfa6-312b-4b02-81ab-70c7ec72b7d4-bound-sa-token\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:14:10.310725 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.310697 2578 generic.go:358] "Generic (PLEG): container finished" podID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerID="659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9" exitCode=0 Apr 17 09:14:10.310877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.310757 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" Apr 17 09:14:10.310877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.310762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" event={"ID":"6936dfa6-312b-4b02-81ab-70c7ec72b7d4","Type":"ContainerDied","Data":"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9"} Apr 17 09:14:10.310877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.310863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759cb445bc-ttncw" event={"ID":"6936dfa6-312b-4b02-81ab-70c7ec72b7d4","Type":"ContainerDied","Data":"519ddfcec1e507f758035ee60491c9318d1181632b3a266ab6ff0c1be7ac8af7"} Apr 17 09:14:10.310999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.310879 2578 scope.go:117] "RemoveContainer" containerID="659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9" Apr 17 09:14:10.319267 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.319249 2578 scope.go:117] "RemoveContainer" containerID="659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9" Apr 17 09:14:10.319510 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:14:10.319485 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9\": container with ID starting with 659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9 not found: ID does not exist" containerID="659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9" Apr 17 09:14:10.319602 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.319514 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9"} err="failed to get container status 
\"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9\": rpc error: code = NotFound desc = could not find container \"659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9\": container with ID starting with 659e490c2069fc320272a44cfdb461ca41de67e781c0852c8090831fb464f1e9 not found: ID does not exist" Apr 17 09:14:10.333018 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.332998 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"] Apr 17 09:14:10.336342 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:10.336322 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-759cb445bc-ttncw"] Apr 17 09:14:11.279555 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.279525 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mw8fd" Apr 17 09:14:11.659632 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.659558 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" path="/var/lib/kubelet/pods/6936dfa6-312b-4b02-81ab-70c7ec72b7d4/volumes" Apr 17 09:14:11.765803 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.765774 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-knk5v"] Apr 17 09:14:11.766084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.766071 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerName="registry" Apr 17 09:14:11.766128 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.766085 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerName="registry" Apr 17 09:14:11.766158 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.766137 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6936dfa6-312b-4b02-81ab-70c7ec72b7d4" containerName="registry" Apr 17 09:14:11.770542 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.770526 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:11.773132 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.773111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 09:14:11.773265 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.773245 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 09:14:11.773326 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.773301 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-b5zjb\"" Apr 17 09:14:11.778294 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.778274 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-knk5v"] Apr 17 09:14:11.820584 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.820562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88tbh\" (UniqueName: \"kubernetes.io/projected/276b809a-2684-4b9d-9c50-dcc32d5cbe03-kube-api-access-88tbh\") pod \"downloads-6bcc868b7-knk5v\" (UID: \"276b809a-2684-4b9d-9c50-dcc32d5cbe03\") " pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:11.921436 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.921372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88tbh\" (UniqueName: \"kubernetes.io/projected/276b809a-2684-4b9d-9c50-dcc32d5cbe03-kube-api-access-88tbh\") pod \"downloads-6bcc868b7-knk5v\" (UID: \"276b809a-2684-4b9d-9c50-dcc32d5cbe03\") " pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:11.931690 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:11.931661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88tbh\" (UniqueName: \"kubernetes.io/projected/276b809a-2684-4b9d-9c50-dcc32d5cbe03-kube-api-access-88tbh\") pod \"downloads-6bcc868b7-knk5v\" (UID: \"276b809a-2684-4b9d-9c50-dcc32d5cbe03\") " pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:12.080502 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:12.080473 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:12.196284 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:12.196222 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-knk5v"] Apr 17 09:14:12.199078 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:12.199044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276b809a_2684_4b9d_9c50_dcc32d5cbe03.slice/crio-6ae91d31180eeecfaa90e599e62e3e22912d439e475c8995587005ccb5a6e07e WatchSource:0}: Error finding container 6ae91d31180eeecfaa90e599e62e3e22912d439e475c8995587005ccb5a6e07e: Status 404 returned error can't find the container with id 6ae91d31180eeecfaa90e599e62e3e22912d439e475c8995587005ccb5a6e07e Apr 17 09:14:12.322276 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:12.322241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-knk5v" event={"ID":"276b809a-2684-4b9d-9c50-dcc32d5cbe03","Type":"ContainerStarted","Data":"6ae91d31180eeecfaa90e599e62e3e22912d439e475c8995587005ccb5a6e07e"} Apr 17 09:14:20.897852 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:20.897811 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:20.898209 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:20.897871 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:21.886166 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.886130 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:14:21.889716 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.889696 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:21.893963 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.893938 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 09:14:21.894099 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.893963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 09:14:21.894099 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.893994 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-f9k9v\"" Apr 17 09:14:21.894099 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.894010 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 09:14:21.894099 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.894057 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 09:14:21.894330 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.894211 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 09:14:21.901061 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:21.901040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:14:22.016035 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:14:22.015998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.016199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.016090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65mvc\" (UniqueName: \"kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.016199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.016132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.016199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.016166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.016357 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.016206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.016357 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.016306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118006 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.117964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118342 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118255 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118342 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.118554 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65mvc\" (UniqueName: \"kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.119168 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.118765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.119297 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.119242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.119502 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:14:22.119462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.121116 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.121089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.121318 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.121291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.127455 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.127432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65mvc\" (UniqueName: \"kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc\") pod \"console-7b989f78c4-xs2tx\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:22.200828 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:22.200744 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:27.874466 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:27.874441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:14:27.891162 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:27.891124 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c0aa7a_a320_46fc_81bb_557c14ad8572.slice/crio-82b4249ffd0e24bca734a769e2497cf91e42a6ca1bd0ee67de6d14030333e0fb WatchSource:0}: Error finding container 82b4249ffd0e24bca734a769e2497cf91e42a6ca1bd0ee67de6d14030333e0fb: Status 404 returned error can't find the container with id 82b4249ffd0e24bca734a769e2497cf91e42a6ca1bd0ee67de6d14030333e0fb Apr 17 09:14:28.377759 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:28.377711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b989f78c4-xs2tx" event={"ID":"d8c0aa7a-a320-46fc-81bb-557c14ad8572","Type":"ContainerStarted","Data":"82b4249ffd0e24bca734a769e2497cf91e42a6ca1bd0ee67de6d14030333e0fb"} Apr 17 09:14:28.379266 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:28.379235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-knk5v" event={"ID":"276b809a-2684-4b9d-9c50-dcc32d5cbe03","Type":"ContainerStarted","Data":"73442e4cbe37652b58ffa3ea037a23c7a5c3a6003a6b4c3e2cceed804e81d3a7"} Apr 17 09:14:28.379488 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:28.379465 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:28.399371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:28.399319 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-knk5v" podStartSLOduration=1.74988564 podStartE2EDuration="17.399305921s" 
podCreationTimestamp="2026-04-17 09:14:11 +0000 UTC" firstStartedPulling="2026-04-17 09:14:12.2011677 +0000 UTC m=+177.103225058" lastFinishedPulling="2026-04-17 09:14:27.850587966 +0000 UTC m=+192.752645339" observedRunningTime="2026-04-17 09:14:28.397462298 +0000 UTC m=+193.299519703" watchObservedRunningTime="2026-04-17 09:14:28.399305921 +0000 UTC m=+193.301363303" Apr 17 09:14:28.404304 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:28.404281 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-knk5v" Apr 17 09:14:30.473267 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.473230 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:14:30.496937 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.496891 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.497369 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.497329 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:14:30.510562 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.510526 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 09:14:30.603253 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603428 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603428 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv9w4\" (UniqueName: \"kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603428 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603428 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603662 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.603662 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.603540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704430 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv9w4\" (UniqueName: \"kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: 
\"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704761 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704818 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.704905 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.704886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.705874 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.705788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.705999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.705899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.705999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.705926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.706256 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.706230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.707508 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.707485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.707612 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.707485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.713932 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.713555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dv9w4\" (UniqueName: \"kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4\") pod \"console-6b8d56cc57-9m2lt\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:30.812229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:30.812078 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:31.341812 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:31.341781 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:14:31.446730 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:14:31.446693 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66c64fc_e28a_4446_9765_074be13471cf.slice/crio-dc6ee903d069beb0868fbcd3e4500e24082459bc47cc678d3e8225872b6e0906 WatchSource:0}: Error finding container dc6ee903d069beb0868fbcd3e4500e24082459bc47cc678d3e8225872b6e0906: Status 404 returned error can't find the container with id dc6ee903d069beb0868fbcd3e4500e24082459bc47cc678d3e8225872b6e0906 Apr 17 09:14:32.395481 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:32.395436 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8d56cc57-9m2lt" event={"ID":"e66c64fc-e28a-4446-9765-074be13471cf","Type":"ContainerStarted","Data":"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172"} Apr 17 09:14:32.395481 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:32.395485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8d56cc57-9m2lt" event={"ID":"e66c64fc-e28a-4446-9765-074be13471cf","Type":"ContainerStarted","Data":"dc6ee903d069beb0868fbcd3e4500e24082459bc47cc678d3e8225872b6e0906"} Apr 17 09:14:32.396986 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:32.396948 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b989f78c4-xs2tx" event={"ID":"d8c0aa7a-a320-46fc-81bb-557c14ad8572","Type":"ContainerStarted","Data":"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0"} Apr 17 09:14:32.414732 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:32.414685 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b8d56cc57-9m2lt" podStartSLOduration=2.414671301 podStartE2EDuration="2.414671301s" podCreationTimestamp="2026-04-17 09:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:14:32.413741493 +0000 UTC m=+197.315798875" watchObservedRunningTime="2026-04-17 09:14:32.414671301 +0000 UTC m=+197.316728681" Apr 17 09:14:32.432220 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:32.432178 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b989f78c4-xs2tx" podStartSLOduration=7.845404811 podStartE2EDuration="11.432164521s" podCreationTimestamp="2026-04-17 09:14:21 +0000 UTC" firstStartedPulling="2026-04-17 09:14:27.893145772 +0000 UTC m=+192.795203132" lastFinishedPulling="2026-04-17 09:14:31.479905484 +0000 UTC m=+196.381962842" observedRunningTime="2026-04-17 09:14:32.431962274 +0000 UTC m=+197.334019678" watchObservedRunningTime="2026-04-17 09:14:32.432164521 +0000 UTC m=+197.334221914" Apr 17 09:14:38.419330 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:38.419295 2578 generic.go:358] "Generic (PLEG): container finished" podID="153b51ca-f712-4926-8c50-8e76eed97427" containerID="f28f944ed6570797fba8e8538daebffea9449de9229f816376477a670bbeadbd" exitCode=0 Apr 17 09:14:38.419913 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:38.419357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" 
event={"ID":"153b51ca-f712-4926-8c50-8e76eed97427","Type":"ContainerDied","Data":"f28f944ed6570797fba8e8538daebffea9449de9229f816376477a670bbeadbd"} Apr 17 09:14:38.419913 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:38.419739 2578 scope.go:117] "RemoveContainer" containerID="f28f944ed6570797fba8e8538daebffea9449de9229f816376477a670bbeadbd" Apr 17 09:14:39.424321 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:39.424284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kbtlr" event={"ID":"153b51ca-f712-4926-8c50-8e76eed97427","Type":"ContainerStarted","Data":"609286b461ee1e73e6c734a7c23be0602c453da5fe378054f26e0bc11bca2206"} Apr 17 09:14:40.813161 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:40.813127 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:40.813161 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:40.813167 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:40.817945 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:40.817926 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:40.903109 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:40.903086 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:40.906846 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:40.906808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-884d56797-hgwvz" Apr 17 09:14:41.433318 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:41.433292 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:14:41.478636 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:41.478600 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:14:42.201066 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:42.201033 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:14:46.443576 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:46.443503 2578 generic.go:358] "Generic (PLEG): container finished" podID="df06ee4d-da4b-4812-876f-8b39a0419cca" containerID="bc67b0da56e6243ebc2cd3bc4f883ccd2e8f154e8bdf062cc1f5c9975b10c1cd" exitCode=0 Apr 17 09:14:46.443965 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:46.443575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" event={"ID":"df06ee4d-da4b-4812-876f-8b39a0419cca","Type":"ContainerDied","Data":"bc67b0da56e6243ebc2cd3bc4f883ccd2e8f154e8bdf062cc1f5c9975b10c1cd"} Apr 17 09:14:46.443965 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:46.443906 2578 scope.go:117] "RemoveContainer" containerID="bc67b0da56e6243ebc2cd3bc4f883ccd2e8f154e8bdf062cc1f5c9975b10c1cd" Apr 17 09:14:47.448211 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:14:47.448178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rlbzq" event={"ID":"df06ee4d-da4b-4812-876f-8b39a0419cca","Type":"ContainerStarted","Data":"2c64fd6c00ddc1c5cd568cfb05752354b8379aea9a0a96e20071594bf4d1eebb"} Apr 17 09:15:06.498593 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.498528 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b989f78c4-xs2tx" podUID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" containerName="console" containerID="cri-o://2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0" gracePeriod=15 Apr 17 09:15:06.767602 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:15:06.767577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b989f78c4-xs2tx_d8c0aa7a-a320-46fc-81bb-557c14ad8572/console/0.log" Apr 17 09:15:06.767750 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.767654 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:15:06.914806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.914767 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.914806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.914810 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.915039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.914881 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.915039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.914905 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.915121 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915053 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.915179 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915115 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65mvc\" (UniqueName: \"kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc\") pod \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\" (UID: \"d8c0aa7a-a320-46fc-81bb-557c14ad8572\") " Apr 17 09:15:06.915329 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915288 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config" (OuterVolumeSpecName: "console-config") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:06.915329 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915303 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:06.915491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915412 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-oauth-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:06.915491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915426 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:06.915491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.915421 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca" (OuterVolumeSpecName: "service-ca") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:06.917222 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.917194 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:06.917330 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.917278 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:06.917330 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:06.917304 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc" (OuterVolumeSpecName: "kube-api-access-65mvc") pod "d8c0aa7a-a320-46fc-81bb-557c14ad8572" (UID: "d8c0aa7a-a320-46fc-81bb-557c14ad8572"). InnerVolumeSpecName "kube-api-access-65mvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:15:07.016569 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.016486 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:07.016569 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.016515 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8c0aa7a-a320-46fc-81bb-557c14ad8572-console-oauth-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:07.016569 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.016528 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8c0aa7a-a320-46fc-81bb-557c14ad8572-service-ca\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:07.016569 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.016542 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65mvc\" (UniqueName: \"kubernetes.io/projected/d8c0aa7a-a320-46fc-81bb-557c14ad8572-kube-api-access-65mvc\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:07.509851 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509815 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7b989f78c4-xs2tx_d8c0aa7a-a320-46fc-81bb-557c14ad8572/console/0.log" Apr 17 09:15:07.510229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509874 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" containerID="2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0" exitCode=2 Apr 17 09:15:07.510229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b989f78c4-xs2tx" event={"ID":"d8c0aa7a-a320-46fc-81bb-557c14ad8572","Type":"ContainerDied","Data":"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0"} Apr 17 09:15:07.510229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509937 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b989f78c4-xs2tx" Apr 17 09:15:07.510229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b989f78c4-xs2tx" event={"ID":"d8c0aa7a-a320-46fc-81bb-557c14ad8572","Type":"ContainerDied","Data":"82b4249ffd0e24bca734a769e2497cf91e42a6ca1bd0ee67de6d14030333e0fb"} Apr 17 09:15:07.510229 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.509974 2578 scope.go:117] "RemoveContainer" containerID="2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0" Apr 17 09:15:07.518072 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.518055 2578 scope.go:117] "RemoveContainer" containerID="2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0" Apr 17 09:15:07.518305 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:07.518288 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0\": container with ID starting with 
2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0 not found: ID does not exist" containerID="2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0" Apr 17 09:15:07.518353 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.518313 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0"} err="failed to get container status \"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0\": rpc error: code = NotFound desc = could not find container \"2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0\": container with ID starting with 2a3aaebdbcd382babbc500d8aa02279480919897c5b9e18272864f85b63587c0 not found: ID does not exist" Apr 17 09:15:07.533361 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.533333 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:15:07.535347 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.535327 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b989f78c4-xs2tx"] Apr 17 09:15:07.659779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:07.659740 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" path="/var/lib/kubelet/pods/d8c0aa7a-a320-46fc-81bb-557c14ad8572/volumes" Apr 17 09:15:16.565693 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.565653 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:16.566208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566153 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-web" containerID="cri-o://324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" 
gracePeriod=120 Apr 17 09:15:16.566301 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566198 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="config-reloader" containerID="cri-o://67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" gracePeriod=120 Apr 17 09:15:16.566301 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566164 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-metric" containerID="cri-o://c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" gracePeriod=120 Apr 17 09:15:16.566301 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566220 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="prom-label-proxy" containerID="cri-o://df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" gracePeriod=120 Apr 17 09:15:16.566438 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566139 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="alertmanager" containerID="cri-o://bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" gracePeriod=120 Apr 17 09:15:16.566438 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:16.566198 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy" containerID="cri-o://339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" gracePeriod=120 Apr 17 09:15:17.543713 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:15:17.543681 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" exitCode=0 Apr 17 09:15:17.543713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543707 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" exitCode=0 Apr 17 09:15:17.543713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543713 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" exitCode=0 Apr 17 09:15:17.543713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543719 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" exitCode=0 Apr 17 09:15:17.543997 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543749 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d"} Apr 17 09:15:17.543997 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95"} Apr 17 09:15:17.543997 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155"} Apr 17 09:15:17.543997 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.543804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61"} Apr 17 09:15:17.811814 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.811793 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:17.904017 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.903990 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpdf9\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904041 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904064 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904084 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904108 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904133 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904171 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904164 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904207 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904225 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904271 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904298 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904328 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.904446 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904352 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca\") pod \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\" (UID: \"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5\") " Apr 17 09:15:17.905142 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:17.905142 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.904858 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:15:17.905777 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.905746 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:17.906903 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.906868 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:15:17.907066 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.907036 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.907208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.907184 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume" (OuterVolumeSpecName: "config-volume") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.907289 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.907211 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9" (OuterVolumeSpecName: "kube-api-access-rpdf9") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "kube-api-access-rpdf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:15:17.907494 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.907466 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.908039 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.908012 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.908517 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.908493 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out" (OuterVolumeSpecName: "config-out") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:15:17.908743 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.908718 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.912640 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.912621 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:17.918042 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:17.918020 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config" (OuterVolumeSpecName: "web-config") pod "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" (UID: "ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:18.005638 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005617 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-tls-assets\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005642 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005659 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005669 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005678 2578 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005689 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-alertmanager-main-db\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005698 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-web-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005707 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-cluster-tls-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005716 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-metrics-client-ca\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.005726 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005725 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpdf9\" (UniqueName: \"kubernetes.io/projected/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-kube-api-access-rpdf9\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.006017 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005734 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-out\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.006017 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005746 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-config-volume\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.006017 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.005756 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5-secret-alertmanager-main-tls\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:15:18.550237 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550204 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" exitCode=0 Apr 17 09:15:18.550237 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550228 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerID="324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" exitCode=0 Apr 17 09:15:18.550421 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8"} Apr 17 09:15:18.550421 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550312 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.550421 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8"} Apr 17 09:15:18.550421 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5","Type":"ContainerDied","Data":"1fb979e19d5f312c7a6d245e500275497750d15c2d8132755bd8e4bb9271419b"} Apr 17 09:15:18.550421 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.550350 2578 scope.go:117] "RemoveContainer" containerID="df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" Apr 17 09:15:18.557472 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.557450 2578 scope.go:117] "RemoveContainer" containerID="c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" Apr 17 09:15:18.564164 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.564146 2578 scope.go:117] "RemoveContainer" containerID="339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" Apr 17 09:15:18.570230 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.570206 2578 scope.go:117] "RemoveContainer" containerID="324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" Apr 17 09:15:18.575353 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.575325 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:18.577297 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.577281 2578 scope.go:117] "RemoveContainer" containerID="67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" Apr 17 09:15:18.578609 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.578590 2578 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:18.583354 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.583337 2578 scope.go:117] "RemoveContainer" containerID="bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" Apr 17 09:15:18.589368 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.589351 2578 scope.go:117] "RemoveContainer" containerID="95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a" Apr 17 09:15:18.595394 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.595379 2578 scope.go:117] "RemoveContainer" containerID="df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" Apr 17 09:15:18.595636 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.595617 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d\": container with ID starting with df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d not found: ID does not exist" containerID="df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" Apr 17 09:15:18.595679 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.595645 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d"} err="failed to get container status \"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d\": rpc error: code = NotFound desc = could not find container \"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d\": container with ID starting with df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d not found: ID does not exist" Apr 17 09:15:18.595679 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.595663 2578 scope.go:117] "RemoveContainer" containerID="c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" Apr 17 
09:15:18.595926 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.595896 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8\": container with ID starting with c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8 not found: ID does not exist" containerID="c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" Apr 17 09:15:18.596013 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.595932 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8"} err="failed to get container status \"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8\": rpc error: code = NotFound desc = could not find container \"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8\": container with ID starting with c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8 not found: ID does not exist" Apr 17 09:15:18.596013 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.595948 2578 scope.go:117] "RemoveContainer" containerID="339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" Apr 17 09:15:18.596164 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.596150 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95\": container with ID starting with 339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95 not found: ID does not exist" containerID="339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" Apr 17 09:15:18.596204 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596168 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95"} err="failed to get container status \"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95\": rpc error: code = NotFound desc = could not find container \"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95\": container with ID starting with 339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95 not found: ID does not exist" Apr 17 09:15:18.596204 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596181 2578 scope.go:117] "RemoveContainer" containerID="324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" Apr 17 09:15:18.596407 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.596391 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8\": container with ID starting with 324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8 not found: ID does not exist" containerID="324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" Apr 17 09:15:18.596452 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596410 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8"} err="failed to get container status \"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8\": rpc error: code = NotFound desc = could not find container \"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8\": container with ID starting with 324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8 not found: ID does not exist" Apr 17 09:15:18.596452 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596424 2578 scope.go:117] "RemoveContainer" containerID="67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" Apr 17 09:15:18.596617 ip-10-0-130-147 
kubenswrapper[2578]: E0417 09:15:18.596602 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155\": container with ID starting with 67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155 not found: ID does not exist" containerID="67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" Apr 17 09:15:18.596664 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596620 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155"} err="failed to get container status \"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155\": rpc error: code = NotFound desc = could not find container \"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155\": container with ID starting with 67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155 not found: ID does not exist" Apr 17 09:15:18.596664 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596632 2578 scope.go:117] "RemoveContainer" containerID="bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" Apr 17 09:15:18.596860 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.596822 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61\": container with ID starting with bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61 not found: ID does not exist" containerID="bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" Apr 17 09:15:18.596912 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596859 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61"} 
err="failed to get container status \"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61\": rpc error: code = NotFound desc = could not find container \"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61\": container with ID starting with bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61 not found: ID does not exist" Apr 17 09:15:18.596912 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.596877 2578 scope.go:117] "RemoveContainer" containerID="95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a" Apr 17 09:15:18.597080 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:15:18.597063 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a\": container with ID starting with 95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a not found: ID does not exist" containerID="95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a" Apr 17 09:15:18.597118 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597082 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a"} err="failed to get container status \"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a\": rpc error: code = NotFound desc = could not find container \"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a\": container with ID starting with 95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a not found: ID does not exist" Apr 17 09:15:18.597118 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597094 2578 scope.go:117] "RemoveContainer" containerID="df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d" Apr 17 09:15:18.597286 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597269 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d"} err="failed to get container status \"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d\": rpc error: code = NotFound desc = could not find container \"df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d\": container with ID starting with df5cc47276226d0fc84cd75238a101169c99ec536958e93d79b3d0cefebbca6d not found: ID does not exist" Apr 17 09:15:18.597286 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597283 2578 scope.go:117] "RemoveContainer" containerID="c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8" Apr 17 09:15:18.597549 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597520 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8"} err="failed to get container status \"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8\": rpc error: code = NotFound desc = could not find container \"c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8\": container with ID starting with c1c28e44c8c3d702ccb9a0f5d62b86d7dab4b56aa01ab5ab2523ee3389ab6ab8 not found: ID does not exist" Apr 17 09:15:18.597549 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597548 2578 scope.go:117] "RemoveContainer" containerID="339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95" Apr 17 09:15:18.597896 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597816 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95"} err="failed to get container status \"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95\": rpc error: code = NotFound desc = could not find container \"339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95\": container with ID starting with 
339a7513979215caf825c520a08adc3d4f58ba03d2d5324f7bd2d08e2e0f7d95 not found: ID does not exist" Apr 17 09:15:18.597896 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.597873 2578 scope.go:117] "RemoveContainer" containerID="324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8" Apr 17 09:15:18.598156 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598136 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8"} err="failed to get container status \"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8\": rpc error: code = NotFound desc = could not find container \"324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8\": container with ID starting with 324ce7296e0b468dd4daf7602dd2e772b9c33e78d99ece606d05af311bb413a8 not found: ID does not exist" Apr 17 09:15:18.598156 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598156 2578 scope.go:117] "RemoveContainer" containerID="67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155" Apr 17 09:15:18.598400 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598382 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155"} err="failed to get container status \"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155\": rpc error: code = NotFound desc = could not find container \"67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155\": container with ID starting with 67f3aba96ba48696a3e4ed7db58e88192134089d91b0e5b270d968a855daf155 not found: ID does not exist" Apr 17 09:15:18.598445 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598411 2578 scope.go:117] "RemoveContainer" containerID="bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61" Apr 17 09:15:18.598640 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598624 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61"} err="failed to get container status \"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61\": rpc error: code = NotFound desc = could not find container \"bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61\": container with ID starting with bac4e1f2357d95cc5b6737ba8179c717eabcc4214c1ac47a343b42dceb719c61 not found: ID does not exist" Apr 17 09:15:18.598640 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598640 2578 scope.go:117] "RemoveContainer" containerID="95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a" Apr 17 09:15:18.598860 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.598824 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a"} err="failed to get container status \"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a\": rpc error: code = NotFound desc = could not find container \"95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a\": container with ID starting with 95d2c9f400a1813e68d46aff1d54d2170c6d6212991a3cb1b7babf6395ad9a1a not found: ID does not exist" Apr 17 09:15:18.609170 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609146 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:18.609558 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609540 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" containerName="console" Apr 17 09:15:18.609607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609561 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" containerName="console" Apr 17 09:15:18.609607 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:15:18.609572 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="config-reloader" Apr 17 09:15:18.609607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609578 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="config-reloader" Apr 17 09:15:18.609607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609599 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-web" Apr 17 09:15:18.609607 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609607 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-web" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609616 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609622 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609633 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="alertmanager" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609638 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="alertmanager" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609644 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="prom-label-proxy" 
Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609649 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="prom-label-proxy" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609655 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="init-config-reloader" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609661 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="init-config-reloader" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609668 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-metric" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609673 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-metric" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609721 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="alertmanager" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609728 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="config-reloader" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609734 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-metric" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609740 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" 
containerName="prom-label-proxy" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609749 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy-web" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609757 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" containerName="kube-rbac-proxy" Apr 17 09:15:18.609762 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.609765 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8c0aa7a-a320-46fc-81bb-557c14ad8572" containerName="console" Apr 17 09:15:18.614736 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.614721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.617650 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.617629 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gf2dg\"" Apr 17 09:15:18.617650 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.617646 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 09:15:18.617782 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.617657 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 09:15:18.617782 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.617634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 09:15:18.617961 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.617945 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 09:15:18.618107 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.618085 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 09:15:18.618205 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.618175 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 09:15:18.618276 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.618204 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 09:15:18.618276 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.618192 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 09:15:18.622372 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.622348 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 09:15:18.627260 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.627242 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:18.710473 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptqg6\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-kube-api-access-ptqg6\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710599 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710599 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710599 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710697 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710697 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-config-volume\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710697 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-web-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-config-out\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.710889 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.710882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812133 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-config-out\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812133 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812133 ip-10-0-130-147 kubenswrapper[2578]: 
I0417 09:15:18.812115 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptqg6\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-kube-api-access-ptqg6\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.812623 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-config-volume\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-tls-assets\") pod \"alertmanager-main-0\" 
(UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.812772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-web-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813045 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.813019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.813294 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.813143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/feb2874c-8009-4328-a602-437bc8212b5a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.815292 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.815247 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.815292 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.815280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.815806 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.815783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.815968 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.815950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.816049 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.816010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-config-volume\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.816107 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:15:18.816081 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-web-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.816107 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.816094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.816284 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.816268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/feb2874c-8009-4328-a602-437bc8212b5a-config-out\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.816914 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.816898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/feb2874c-8009-4328-a602-437bc8212b5a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.822288 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.822268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptqg6\" (UniqueName: \"kubernetes.io/projected/feb2874c-8009-4328-a602-437bc8212b5a-kube-api-access-ptqg6\") pod \"alertmanager-main-0\" (UID: \"feb2874c-8009-4328-a602-437bc8212b5a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:18.923455 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:18.923422 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 09:15:19.049778 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:19.049756 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 09:15:19.051721 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:15:19.051699 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb2874c_8009_4328_a602_437bc8212b5a.slice/crio-9783be2253e521e287641ff898fb5ce9df54976e94a221eb7b763eaf155c4719 WatchSource:0}: Error finding container 9783be2253e521e287641ff898fb5ce9df54976e94a221eb7b763eaf155c4719: Status 404 returned error can't find the container with id 9783be2253e521e287641ff898fb5ce9df54976e94a221eb7b763eaf155c4719 Apr 17 09:15:19.554991 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:19.554960 2578 generic.go:358] "Generic (PLEG): container finished" podID="feb2874c-8009-4328-a602-437bc8212b5a" containerID="9789b6726326c866f6f725eee1a030c74d8f093b9c6b4928df2837d45dbcd8f9" exitCode=0 Apr 17 09:15:19.555134 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:19.555032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerDied","Data":"9789b6726326c866f6f725eee1a030c74d8f093b9c6b4928df2837d45dbcd8f9"} Apr 17 09:15:19.555134 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:19.555054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"9783be2253e521e287641ff898fb5ce9df54976e94a221eb7b763eaf155c4719"} Apr 17 09:15:19.659199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:19.659162 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5" 
path="/var/lib/kubelet/pods/ddfa3f44-ca1e-44c1-9dd2-8d20bc0a1bc5/volumes" Apr 17 09:15:20.561371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"3b30b724a2e7399b3cbe582710fd24b418821625a6d131ec9448fbf9ebcecb5f"} Apr 17 09:15:20.561371 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"8146bbc591304d4a782ec1202b578d4807cdebab5429b9375eee87dcca647fba"} Apr 17 09:15:20.561772 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"065699a2e416cb6b90316ff620a1ee78bb558890ce36d3cc0455f82c25d090e7"} Apr 17 09:15:20.561772 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561392 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"bb9593817476c4070bcfb44d2376121f01399c9c3e4fd75a455424fe599c6e5d"} Apr 17 09:15:20.561772 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"ddc03215a3ad6bfe6f56738f407b2d39ee41e15473d0302b96c5f6293757def1"} Apr 17 09:15:20.561772 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.561420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"feb2874c-8009-4328-a602-437bc8212b5a","Type":"ContainerStarted","Data":"75990039c62161a0089a7fd8a54103df66c05045af1d96c45a5f4a74d3540b72"} Apr 17 09:15:20.570892 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.570865 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk"] Apr 17 09:15:20.574239 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.574220 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.576848 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.576815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 09:15:20.576948 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.576814 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 09:15:20.577010 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.576959 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 09:15:20.577010 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.576982 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 09:15:20.577112 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.576985 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xb478\"" Apr 17 09:15:20.577211 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.577198 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 09:15:20.582687 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.582663 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 09:15:20.586470 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.586451 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk"] Apr 17 09:15:20.595713 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.594967 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.594952437 podStartE2EDuration="2.594952437s" podCreationTimestamp="2026-04-17 09:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:15:20.594310185 +0000 UTC m=+245.496367567" watchObservedRunningTime="2026-04-17 09:15:20.594952437 +0000 UTC m=+245.497009815" Apr 17 09:15:20.728353 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728516 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728604 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728506 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-metrics-client-ca\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728604 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-serving-certs-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.728906 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-federate-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.729031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.728921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pf6v\" (UniqueName: \"kubernetes.io/projected/cdb69cfc-28ac-4182-a68a-432310e1dad2-kube-api-access-9pf6v\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829454 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-serving-certs-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829454 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829661 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-federate-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " 
pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829661 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pf6v\" (UniqueName: \"kubernetes.io/projected/cdb69cfc-28ac-4182-a68a-432310e1dad2-kube-api-access-9pf6v\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829661 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829807 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829807 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-metrics-client-ca\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.829807 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.829799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.830125 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.830101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-serving-certs-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.830485 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.830462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-metrics-client-ca\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.830627 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.830605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.832195 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.832165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.832277 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.832175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-federate-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.832363 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.832347 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-secret-telemeter-client\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.832399 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.832372 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cdb69cfc-28ac-4182-a68a-432310e1dad2-telemeter-client-tls\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.837707 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.837691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pf6v\" (UniqueName: \"kubernetes.io/projected/cdb69cfc-28ac-4182-a68a-432310e1dad2-kube-api-access-9pf6v\") pod \"telemeter-client-7bfb545dc8-4g5kk\" (UID: \"cdb69cfc-28ac-4182-a68a-432310e1dad2\") " pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:20.883751 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:20.883727 2578 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" Apr 17 09:15:21.003496 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:21.003451 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk"] Apr 17 09:15:21.005380 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:15:21.005351 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb69cfc_28ac_4182_a68a_432310e1dad2.slice/crio-1cdb2d65c3c64cdda6426ce0499f231d56bde6a066cff9ff2cd7987cd2db8d3b WatchSource:0}: Error finding container 1cdb2d65c3c64cdda6426ce0499f231d56bde6a066cff9ff2cd7987cd2db8d3b: Status 404 returned error can't find the container with id 1cdb2d65c3c64cdda6426ce0499f231d56bde6a066cff9ff2cd7987cd2db8d3b Apr 17 09:15:21.566812 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:21.566770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" event={"ID":"cdb69cfc-28ac-4182-a68a-432310e1dad2","Type":"ContainerStarted","Data":"1cdb2d65c3c64cdda6426ce0499f231d56bde6a066cff9ff2cd7987cd2db8d3b"} Apr 17 09:15:23.575909 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:23.575862 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" event={"ID":"cdb69cfc-28ac-4182-a68a-432310e1dad2","Type":"ContainerStarted","Data":"0d866b5627ebcbb5bfc872d338953f73f404b3a1cd5551a5894a27cb6b57604d"} Apr 17 09:15:23.575909 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:23.575914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" event={"ID":"cdb69cfc-28ac-4182-a68a-432310e1dad2","Type":"ContainerStarted","Data":"02e706e902d265e2c9ee586f530b29443f1cc1c8bdf6a0e4bee9e06341691aaf"} Apr 17 09:15:23.576276 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:23.575929 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" event={"ID":"cdb69cfc-28ac-4182-a68a-432310e1dad2","Type":"ContainerStarted","Data":"fc18ac393504a7f1eceb0589cd06b0786c9a5af0038abdd1efe0d7149b21c20a"} Apr 17 09:15:23.599789 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:23.599737 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7bfb545dc8-4g5kk" podStartSLOduration=1.688017726 podStartE2EDuration="3.599723518s" podCreationTimestamp="2026-04-17 09:15:20 +0000 UTC" firstStartedPulling="2026-04-17 09:15:21.007061412 +0000 UTC m=+245.909118775" lastFinishedPulling="2026-04-17 09:15:22.918767208 +0000 UTC m=+247.820824567" observedRunningTime="2026-04-17 09:15:23.59812274 +0000 UTC m=+248.500180120" watchObservedRunningTime="2026-04-17 09:15:23.599723518 +0000 UTC m=+248.501780944" Apr 17 09:15:24.340334 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.340300 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"] Apr 17 09:15:24.343743 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.343719 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.355240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.355216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"] Apr 17 09:15:24.463140 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463257 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463257 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463356 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463356 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vc5h\" (UniqueName: \"kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463356 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.463356 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.463330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564398 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564398 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vc5h\" (UniqueName: \"kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h\") pod 
\"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.564578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.564525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.565144 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.565118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.565228 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.565188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.565228 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.565209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.565427 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.565407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.566771 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.566742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.566919 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.566899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.572239 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.572215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vc5h\" (UniqueName: \"kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h\") pod \"console-54559dc8b-zwnqp\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.652732 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.652680 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:24.792532 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:24.792501 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"] Apr 17 09:15:24.795664 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:15:24.795638 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b32e60_0575_404c_a7ee_9b025ad967da.slice/crio-6b9343e802bf8f1cfa1d6b4ab4bcaab0b55c6dbaa8d380f62619b91fe1cc0da2 WatchSource:0}: Error finding container 6b9343e802bf8f1cfa1d6b4ab4bcaab0b55c6dbaa8d380f62619b91fe1cc0da2: Status 404 returned error can't find the container with id 6b9343e802bf8f1cfa1d6b4ab4bcaab0b55c6dbaa8d380f62619b91fe1cc0da2 Apr 17 09:15:25.583527 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:25.583496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559dc8b-zwnqp" event={"ID":"88b32e60-0575-404c-a7ee-9b025ad967da","Type":"ContainerStarted","Data":"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"} Apr 17 09:15:25.583527 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:25.583528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559dc8b-zwnqp" event={"ID":"88b32e60-0575-404c-a7ee-9b025ad967da","Type":"ContainerStarted","Data":"6b9343e802bf8f1cfa1d6b4ab4bcaab0b55c6dbaa8d380f62619b91fe1cc0da2"} Apr 17 09:15:25.601100 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:25.601036 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54559dc8b-zwnqp" podStartSLOduration=1.601020533 podStartE2EDuration="1.601020533s" podCreationTimestamp="2026-04-17 09:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:15:25.600175557 +0000 UTC m=+250.502232937" 
watchObservedRunningTime="2026-04-17 09:15:25.601020533 +0000 UTC m=+250.503077914" Apr 17 09:15:26.381830 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:26.381800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:15:26.384074 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:26.384050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea2ee429-d7fa-4703-99bd-5d963ebab30c-metrics-certs\") pod \"network-metrics-daemon-4h6v9\" (UID: \"ea2ee429-d7fa-4703-99bd-5d963ebab30c\") " pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:15:26.558635 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:26.558611 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\"" Apr 17 09:15:26.565958 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:26.565939 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4h6v9" Apr 17 09:15:26.680512 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:26.680479 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4h6v9"] Apr 17 09:15:26.683499 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:15:26.683464 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea2ee429_d7fa_4703_99bd_5d963ebab30c.slice/crio-0e0c97aea9ad48a6f17f7ab9540c1a98c43aa593cdb3b088f2a8f7b5901e405f WatchSource:0}: Error finding container 0e0c97aea9ad48a6f17f7ab9540c1a98c43aa593cdb3b088f2a8f7b5901e405f: Status 404 returned error can't find the container with id 0e0c97aea9ad48a6f17f7ab9540c1a98c43aa593cdb3b088f2a8f7b5901e405f Apr 17 09:15:27.590908 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:27.590870 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4h6v9" event={"ID":"ea2ee429-d7fa-4703-99bd-5d963ebab30c","Type":"ContainerStarted","Data":"0e0c97aea9ad48a6f17f7ab9540c1a98c43aa593cdb3b088f2a8f7b5901e405f"} Apr 17 09:15:28.595645 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:28.595608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4h6v9" event={"ID":"ea2ee429-d7fa-4703-99bd-5d963ebab30c","Type":"ContainerStarted","Data":"771e31b5a6adf56f34c6ad1eab85fdc6eda6a42633072166d5eb2fa98df5c818"} Apr 17 09:15:28.595645 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:28.595645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4h6v9" event={"ID":"ea2ee429-d7fa-4703-99bd-5d963ebab30c","Type":"ContainerStarted","Data":"88f3c4ab0fe8b752b642a529e171fe2596a1fb002e2b494faca5ba3b0fd73ad4"} Apr 17 09:15:28.612808 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:28.612399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-4h6v9" podStartSLOduration=252.591133831 podStartE2EDuration="4m13.612380289s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:15:26.685540392 +0000 UTC m=+251.587597757" lastFinishedPulling="2026-04-17 09:15:27.706786853 +0000 UTC m=+252.608844215" observedRunningTime="2026-04-17 09:15:28.612183091 +0000 UTC m=+253.514240472" watchObservedRunningTime="2026-04-17 09:15:28.612380289 +0000 UTC m=+253.514437671" Apr 17 09:15:34.653823 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:34.653751 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:34.654194 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:34.653909 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:34.658188 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:34.658168 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:35.621915 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:35.621884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:15:35.667725 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:15:35.667695 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:16:00.686466 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:00.686413 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b8d56cc57-9m2lt" podUID="e66c64fc-e28a-4446-9765-074be13471cf" containerName="console" containerID="cri-o://b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172" gracePeriod=15 Apr 17 09:16:00.923252 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:00.923231 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8d56cc57-9m2lt_e66c64fc-e28a-4446-9765-074be13471cf/console/0.log" Apr 17 09:16:00.923361 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:00.923285 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:16:01.037725 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037692 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.037895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037737 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.037895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037762 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.037895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037788 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.037895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037816 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.038086 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037921 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv9w4\" (UniqueName: \"kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.038086 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.037968 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert\") pod \"e66c64fc-e28a-4446-9765-074be13471cf\" (UID: \"e66c64fc-e28a-4446-9765-074be13471cf\") " Apr 17 09:16:01.038256 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.038231 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:16:01.038316 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.038275 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config" (OuterVolumeSpecName: "console-config") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:16:01.038432 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.038371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:16:01.038577 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.038551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:16:01.040143 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.040122 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:16:01.040417 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.040397 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:16:01.040482 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.040438 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4" (OuterVolumeSpecName: "kube-api-access-dv9w4") pod "e66c64fc-e28a-4446-9765-074be13471cf" (UID: "e66c64fc-e28a-4446-9765-074be13471cf"). InnerVolumeSpecName "kube-api-access-dv9w4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:16:01.139572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139537 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-oauth-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139561 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-console-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139572 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-service-ca\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139581 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139894 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139590 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e66c64fc-e28a-4446-9765-074be13471cf-trusted-ca-bundle\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139894 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139598 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e66c64fc-e28a-4446-9765-074be13471cf-console-oauth-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.139894 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.139607 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dv9w4\" (UniqueName: \"kubernetes.io/projected/e66c64fc-e28a-4446-9765-074be13471cf-kube-api-access-dv9w4\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:16:01.691333 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691306 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8d56cc57-9m2lt_e66c64fc-e28a-4446-9765-074be13471cf/console/0.log" Apr 17 09:16:01.691670 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691345 2578 generic.go:358] "Generic (PLEG): container finished" podID="e66c64fc-e28a-4446-9765-074be13471cf" containerID="b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172" exitCode=2 Apr 17 09:16:01.691670 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8d56cc57-9m2lt" event={"ID":"e66c64fc-e28a-4446-9765-074be13471cf","Type":"ContainerDied","Data":"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172"} Apr 17 09:16:01.691670 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691432 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b8d56cc57-9m2lt" Apr 17 09:16:01.691670 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8d56cc57-9m2lt" event={"ID":"e66c64fc-e28a-4446-9765-074be13471cf","Type":"ContainerDied","Data":"dc6ee903d069beb0868fbcd3e4500e24082459bc47cc678d3e8225872b6e0906"} Apr 17 09:16:01.691670 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.691463 2578 scope.go:117] "RemoveContainer" containerID="b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172" Apr 17 09:16:01.699014 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.698998 2578 scope.go:117] "RemoveContainer" containerID="b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172" Apr 17 09:16:01.699259 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:16:01.699242 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172\": container with ID starting with b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172 not found: ID does not exist" containerID="b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172" Apr 17 09:16:01.699323 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.699265 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172"} err="failed to get container status \"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172\": rpc error: code = NotFound desc = could not find container \"b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172\": container with ID starting with b638f48380d218d0aea67953408e6f70defbdbdc28e42d57d30e83f7a0a0f172 not found: ID does not exist" Apr 17 09:16:01.709417 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.709395 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:16:01.713076 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:01.713055 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b8d56cc57-9m2lt"] Apr 17 09:16:03.659312 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:03.659280 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66c64fc-e28a-4446-9765-074be13471cf" path="/var/lib/kubelet/pods/e66c64fc-e28a-4446-9765-074be13471cf/volumes" Apr 17 09:16:15.568106 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:16:15.568085 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 09:17:11.418486 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.418406 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wc8rj"] Apr 17 09:17:11.418905 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.418703 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e66c64fc-e28a-4446-9765-074be13471cf" containerName="console" Apr 17 09:17:11.418905 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.418714 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66c64fc-e28a-4446-9765-074be13471cf" containerName="console" Apr 17 09:17:11.418905 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.418775 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e66c64fc-e28a-4446-9765-074be13471cf" containerName="console" Apr 17 09:17:11.421686 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.421670 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.424340 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.424322 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 09:17:11.429366 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.429341 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wc8rj"] Apr 17 09:17:11.468161 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.468138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-kubelet-config\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.468264 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.468174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-dbus\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.468264 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.468228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-original-pull-secret\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.516844 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.516821 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-859c59bdc7-bx7tg"] Apr 17 09:17:11.519782 ip-10-0-130-147 
kubenswrapper[2578]: I0417 09:17:11.519766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.527876 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.527854 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859c59bdc7-bx7tg"] Apr 17 09:17:11.569084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-oauth-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-trusted-ca-bundle\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-kubelet-config\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.569199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-dbus\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " 
pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.569199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-oauth-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7xl\" (UniqueName: \"kubernetes.io/projected/addbe683-fd42-4f4c-a595-2fa6f37d3250-kube-api-access-8f7xl\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-kubelet-config\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-original-pull-secret\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-dbus\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.569419 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.569336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-service-ca\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.571576 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.571557 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0f470e3b-3dff-4a2d-9794-fbd675ca2ae9-original-pull-secret\") pod \"global-pull-secret-syncer-wc8rj\" (UID: \"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9\") " pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.669638 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669586 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-oauth-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669638 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-trusted-ca-bundle\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669638 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-oauth-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7xl\" (UniqueName: \"kubernetes.io/projected/addbe683-fd42-4f4c-a595-2fa6f37d3250-kube-api-access-8f7xl\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669877 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.669877 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.669759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-service-ca\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.670388 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.670369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-oauth-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.670524 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.670505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.670572 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.670532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-service-ca\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 
09:17:11.670615 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.670579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/addbe683-fd42-4f4c-a595-2fa6f37d3250-trusted-ca-bundle\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.672396 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.672377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-oauth-config\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.672517 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.672499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/addbe683-fd42-4f4c-a595-2fa6f37d3250-console-serving-cert\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.680449 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.680421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7xl\" (UniqueName: \"kubernetes.io/projected/addbe683-fd42-4f4c-a595-2fa6f37d3250-kube-api-access-8f7xl\") pod \"console-859c59bdc7-bx7tg\" (UID: \"addbe683-fd42-4f4c-a595-2fa6f37d3250\") " pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.731111 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.731090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wc8rj" Apr 17 09:17:11.828824 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.828785 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:11.845218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.845192 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wc8rj"] Apr 17 09:17:11.846327 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:17:11.846301 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f470e3b_3dff_4a2d_9794_fbd675ca2ae9.slice/crio-86d7b7de03d61b432acd6720b7b665bcb7237d8d0df53568103f1272e91c997e WatchSource:0}: Error finding container 86d7b7de03d61b432acd6720b7b665bcb7237d8d0df53568103f1272e91c997e: Status 404 returned error can't find the container with id 86d7b7de03d61b432acd6720b7b665bcb7237d8d0df53568103f1272e91c997e Apr 17 09:17:11.848147 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.848128 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 09:17:11.894712 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.894668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wc8rj" event={"ID":"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9","Type":"ContainerStarted","Data":"86d7b7de03d61b432acd6720b7b665bcb7237d8d0df53568103f1272e91c997e"} Apr 17 09:17:11.945633 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:11.945609 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859c59bdc7-bx7tg"] Apr 17 09:17:11.948050 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:17:11.948021 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaddbe683_fd42_4f4c_a595_2fa6f37d3250.slice/crio-fc5a7e5afac6009e80788827269fc6f838cbc8a2cb62a0e3ac095ebc7cca8b53 WatchSource:0}: Error finding container fc5a7e5afac6009e80788827269fc6f838cbc8a2cb62a0e3ac095ebc7cca8b53: Status 404 returned error can't find 
the container with id fc5a7e5afac6009e80788827269fc6f838cbc8a2cb62a0e3ac095ebc7cca8b53 Apr 17 09:17:12.899227 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:12.899189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859c59bdc7-bx7tg" event={"ID":"addbe683-fd42-4f4c-a595-2fa6f37d3250","Type":"ContainerStarted","Data":"2c1fb3d8ea4c447b5da49384b20e131002d8e4d0e6a3da04c507d2aa1ee7cc92"} Apr 17 09:17:12.899227 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:12.899232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859c59bdc7-bx7tg" event={"ID":"addbe683-fd42-4f4c-a595-2fa6f37d3250","Type":"ContainerStarted","Data":"fc5a7e5afac6009e80788827269fc6f838cbc8a2cb62a0e3ac095ebc7cca8b53"} Apr 17 09:17:12.921939 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:12.921877 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-859c59bdc7-bx7tg" podStartSLOduration=1.9218649270000001 podStartE2EDuration="1.921864927s" podCreationTimestamp="2026-04-17 09:17:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:17:12.920796178 +0000 UTC m=+357.822853571" watchObservedRunningTime="2026-04-17 09:17:12.921864927 +0000 UTC m=+357.823922308" Apr 17 09:17:16.916852 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:16.916794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wc8rj" event={"ID":"0f470e3b-3dff-4a2d-9794-fbd675ca2ae9","Type":"ContainerStarted","Data":"01c8d2f4103a33d51f3dfb4b5c9f1d8ae9afeb2ef78736186bdeba613aa2db4b"} Apr 17 09:17:16.933639 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:16.933592 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wc8rj" podStartSLOduration=1.806216721 podStartE2EDuration="5.933579114s" 
podCreationTimestamp="2026-04-17 09:17:11 +0000 UTC" firstStartedPulling="2026-04-17 09:17:11.84825172 +0000 UTC m=+356.750309083" lastFinishedPulling="2026-04-17 09:17:15.975614106 +0000 UTC m=+360.877671476" observedRunningTime="2026-04-17 09:17:16.931600678 +0000 UTC m=+361.833658060" watchObservedRunningTime="2026-04-17 09:17:16.933579114 +0000 UTC m=+361.835636495" Apr 17 09:17:21.828999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:21.828959 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:21.828999 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:21.829000 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:21.833823 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:21.833801 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:21.936985 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:21.936951 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-859c59bdc7-bx7tg" Apr 17 09:17:21.983815 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:21.983775 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"] Apr 17 09:17:40.015599 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.015567 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h"] Apr 17 09:17:40.017880 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.017865 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.020431 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.020406 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 09:17:40.020566 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.020437 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-twc4k\"" Apr 17 09:17:40.021619 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.021601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 09:17:40.028160 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.028137 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h"] Apr 17 09:17:40.100174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.100143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.100343 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.100179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgxf\" (UniqueName: \"kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.100343 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.100318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.200925 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.200894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.201062 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.200932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgxf\" (UniqueName: \"kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.201062 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.200978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.201344 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.201321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.201404 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.201345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.209591 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.209568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgxf\" (UniqueName: \"kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.328117 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.328050 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" Apr 17 09:17:40.454335 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.454227 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h"] Apr 17 09:17:40.457906 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:17:40.457882 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eefd0ce_ae8d_4541_b846_ee63299f6f8b.slice/crio-245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d WatchSource:0}: Error finding container 245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d: Status 404 returned error can't find the container with id 245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d Apr 17 09:17:40.986647 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:40.986605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerStarted","Data":"245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d"} Apr 17 09:17:47.003660 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.003625 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54559dc8b-zwnqp" podUID="88b32e60-0575-404c-a7ee-9b025ad967da" containerName="console" containerID="cri-o://232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e" gracePeriod=15 Apr 17 09:17:47.007321 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.007297 2578 generic.go:358] "Generic (PLEG): container finished" podID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerID="5de24f0e1d73a743bc2e098f1ba448bcdb1742e42d188b3f9c27f4db863bac5f" exitCode=0 Apr 17 09:17:47.007414 ip-10-0-130-147 kubenswrapper[2578]: 
I0417 09:17:47.007336 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerDied","Data":"5de24f0e1d73a743bc2e098f1ba448bcdb1742e42d188b3f9c27f4db863bac5f"} Apr 17 09:17:47.239614 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.239594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54559dc8b-zwnqp_88b32e60-0575-404c-a7ee-9b025ad967da/console/0.log" Apr 17 09:17:47.239717 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.239651 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54559dc8b-zwnqp" Apr 17 09:17:47.365307 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365267 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vc5h\" (UniqueName: \"kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365307 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365312 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365346 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365540 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:17:47.365362 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365391 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365466 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle\") pod \"88b32e60-0575-404c-a7ee-9b025ad967da\" (UID: \"88b32e60-0575-404c-a7ee-9b025ad967da\") " Apr 17 09:17:47.365880 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365828 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:17:47.365880 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365863 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca" (OuterVolumeSpecName: "service-ca") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:17:47.365880 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365821 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config" (OuterVolumeSpecName: "console-config") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:17:47.366062 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.365951 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:17:47.367613 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.367589 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:17:47.367685 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.367636 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h" (OuterVolumeSpecName: "kube-api-access-6vc5h") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "kube-api-access-6vc5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:17:47.367685 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.367648 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "88b32e60-0575-404c-a7ee-9b025ad967da" (UID: "88b32e60-0575-404c-a7ee-9b025ad967da"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:17:47.466778 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466723 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vc5h\" (UniqueName: \"kubernetes.io/projected/88b32e60-0575-404c-a7ee-9b025ad967da-kube-api-access-6vc5h\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:17:47.466778 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466744 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-oauth-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:17:47.466778 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466754 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-oauth-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:17:47.466778 
ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466764 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-console-config\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:17:47.466778 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466773 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b32e60-0575-404c-a7ee-9b025ad967da-console-serving-cert\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:17:47.467010 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466783 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-service-ca\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:17:47.467010 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:47.466792 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b32e60-0575-404c-a7ee-9b025ad967da-trusted-ca-bundle\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:17:48.011831 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54559dc8b-zwnqp_88b32e60-0575-404c-a7ee-9b025ad967da/console/0.log"
Apr 17 09:17:48.012208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011873 2578 generic.go:358] "Generic (PLEG): container finished" podID="88b32e60-0575-404c-a7ee-9b025ad967da" containerID="232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e" exitCode=2
Apr 17 09:17:48.012208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559dc8b-zwnqp" event={"ID":"88b32e60-0575-404c-a7ee-9b025ad967da","Type":"ContainerDied","Data":"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"}
Apr 17 09:17:48.012208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54559dc8b-zwnqp" event={"ID":"88b32e60-0575-404c-a7ee-9b025ad967da","Type":"ContainerDied","Data":"6b9343e802bf8f1cfa1d6b4ab4bcaab0b55c6dbaa8d380f62619b91fe1cc0da2"}
Apr 17 09:17:48.012208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011973 2578 scope.go:117] "RemoveContainer" containerID="232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"
Apr 17 09:17:48.012208 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.011984 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54559dc8b-zwnqp"
Apr 17 09:17:48.019846 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.019818 2578 scope.go:117] "RemoveContainer" containerID="232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"
Apr 17 09:17:48.020077 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:17:48.020059 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e\": container with ID starting with 232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e not found: ID does not exist" containerID="232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"
Apr 17 09:17:48.020129 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.020084 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e"} err="failed to get container status \"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e\": rpc error: code = NotFound desc = could not find container \"232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e\": container with ID starting with 232c6d13389d6d1c222fc104e7a03fafb2d84cabccf90391a0df4441bb4ed00e not found: ID does not exist"
Apr 17 09:17:48.031551 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.031526 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"]
Apr 17 09:17:48.036829 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:48.036811 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54559dc8b-zwnqp"]
Apr 17 09:17:49.016756 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:49.016720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerStarted","Data":"d3ff7647c23da668bcc7df0e5217edcbbe2c0c3ef3a43e41b130c424cfca428a"}
Apr 17 09:17:49.659574 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:49.659541 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b32e60-0575-404c-a7ee-9b025ad967da" path="/var/lib/kubelet/pods/88b32e60-0575-404c-a7ee-9b025ad967da/volumes"
Apr 17 09:17:50.022182 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:50.022149 2578 generic.go:358] "Generic (PLEG): container finished" podID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerID="d3ff7647c23da668bcc7df0e5217edcbbe2c0c3ef3a43e41b130c424cfca428a" exitCode=0
Apr 17 09:17:50.022558 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:50.022200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerDied","Data":"d3ff7647c23da668bcc7df0e5217edcbbe2c0c3ef3a43e41b130c424cfca428a"}
Apr 17 09:17:59.054084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:59.054047 2578 generic.go:358] "Generic (PLEG): container finished" podID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerID="bbfc6cd8f990abf25ccc5b30049bd105699dfc6dae819f443d6f5002d90c4ec8" exitCode=0
Apr 17 09:17:59.054463 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:17:59.054113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerDied","Data":"bbfc6cd8f990abf25ccc5b30049bd105699dfc6dae819f443d6f5002d90c4ec8"}
Apr 17 09:18:00.173948 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.173927 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h"
Apr 17 09:18:00.271967 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.271936 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgxf\" (UniqueName: \"kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf\") pod \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") "
Apr 17 09:18:00.272138 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.271989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle\") pod \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") "
Apr 17 09:18:00.272138 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.272010 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util\") pod \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\" (UID: \"3eefd0ce-ae8d-4541-b846-ee63299f6f8b\") "
Apr 17 09:18:00.272531 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.272508 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle" (OuterVolumeSpecName: "bundle") pod "3eefd0ce-ae8d-4541-b846-ee63299f6f8b" (UID: "3eefd0ce-ae8d-4541-b846-ee63299f6f8b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 09:18:00.274132 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.274111 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf" (OuterVolumeSpecName: "kube-api-access-lmgxf") pod "3eefd0ce-ae8d-4541-b846-ee63299f6f8b" (UID: "3eefd0ce-ae8d-4541-b846-ee63299f6f8b"). InnerVolumeSpecName "kube-api-access-lmgxf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 09:18:00.277754 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.277728 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util" (OuterVolumeSpecName: "util") pod "3eefd0ce-ae8d-4541-b846-ee63299f6f8b" (UID: "3eefd0ce-ae8d-4541-b846-ee63299f6f8b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 09:18:00.373224 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.373170 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmgxf\" (UniqueName: \"kubernetes.io/projected/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-kube-api-access-lmgxf\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:18:00.373224 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.373190 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-bundle\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:18:00.373224 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:00.373200 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3eefd0ce-ae8d-4541-b846-ee63299f6f8b-util\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\""
Apr 17 09:18:01.060777 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:01.060737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h" event={"ID":"3eefd0ce-ae8d-4541-b846-ee63299f6f8b","Type":"ContainerDied","Data":"245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d"}
Apr 17 09:18:01.060777 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:01.060769 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245afb1cb28b55a4e2a9ec99ba27aff99113fa0b9d1b8a92364774a44a24357d"
Apr 17 09:18:01.060979 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:01.060783 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5fw69h"
Apr 17 09:18:07.128969 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.128938 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"]
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129240 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88b32e60-0575-404c-a7ee-9b025ad967da" containerName="console"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129250 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b32e60-0575-404c-a7ee-9b025ad967da" containerName="console"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129261 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="pull"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129269 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="pull"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129278 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="extract"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129283 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="extract"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129299 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="util"
Apr 17 09:18:07.129322 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129304 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="util"
Apr 17 09:18:07.129546 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129354 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="88b32e60-0575-404c-a7ee-9b025ad967da" containerName="console"
Apr 17 09:18:07.129546 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.129362 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3eefd0ce-ae8d-4541-b846-ee63299f6f8b" containerName="extract"
Apr 17 09:18:07.132756 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.132742 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.135523 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.135497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:18:07.135645 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.135541 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ql52z\""
Apr 17 09:18:07.135645 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.135589 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 09:18:07.142347 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.142325 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"]
Apr 17 09:18:07.227002 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.226968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a0fb570-0236-4cbd-b84d-80f4bd05278a-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.227159 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.227047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb85q\" (UniqueName: \"kubernetes.io/projected/2a0fb570-0236-4cbd-b84d-80f4bd05278a-kube-api-access-hb85q\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.327629 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.327594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a0fb570-0236-4cbd-b84d-80f4bd05278a-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.327785 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.327659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb85q\" (UniqueName: \"kubernetes.io/projected/2a0fb570-0236-4cbd-b84d-80f4bd05278a-kube-api-access-hb85q\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.328028 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.328003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a0fb570-0236-4cbd-b84d-80f4bd05278a-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.342091 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.342061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb85q\" (UniqueName: \"kubernetes.io/projected/2a0fb570-0236-4cbd-b84d-80f4bd05278a-kube-api-access-hb85q\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-tn9j9\" (UID: \"2a0fb570-0236-4cbd-b84d-80f4bd05278a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.442709 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.442637 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"
Apr 17 09:18:07.573588 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:07.573563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9"]
Apr 17 09:18:07.576511 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:18:07.576482 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a0fb570_0236_4cbd_b84d_80f4bd05278a.slice/crio-420f762550e37d7d82b493a9fbdcee2aabfcc445ff5abbef826159cfd5a618f6 WatchSource:0}: Error finding container 420f762550e37d7d82b493a9fbdcee2aabfcc445ff5abbef826159cfd5a618f6: Status 404 returned error can't find the container with id 420f762550e37d7d82b493a9fbdcee2aabfcc445ff5abbef826159cfd5a618f6
Apr 17 09:18:08.083085 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:08.083049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9" event={"ID":"2a0fb570-0236-4cbd-b84d-80f4bd05278a","Type":"ContainerStarted","Data":"420f762550e37d7d82b493a9fbdcee2aabfcc445ff5abbef826159cfd5a618f6"}
Apr 17 09:18:10.090970 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:10.090934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9" event={"ID":"2a0fb570-0236-4cbd-b84d-80f4bd05278a","Type":"ContainerStarted","Data":"0a694d742aef7c075be4123b82165bb874a228fb4b997881e8d77eacc7436ebd"}
Apr 17 09:18:10.120398 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:10.117675 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-tn9j9" podStartSLOduration=1.38010752 podStartE2EDuration="3.1176603s" podCreationTimestamp="2026-04-17 09:18:07 +0000 UTC" firstStartedPulling="2026-04-17 09:18:07.578987096 +0000 UTC m=+412.481044459" lastFinishedPulling="2026-04-17 09:18:09.316539867 +0000 UTC m=+414.218597239" observedRunningTime="2026-04-17 09:18:10.116097727 +0000 UTC m=+415.018155108" watchObservedRunningTime="2026-04-17 09:18:10.1176603 +0000 UTC m=+415.019717684"
Apr 17 09:18:12.222261 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.222229 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2vqnj"]
Apr 17 09:18:12.225636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.225619 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.228257 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.228230 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 09:18:12.228381 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.228300 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-86wr8\""
Apr 17 09:18:12.228381 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.228328 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 09:18:12.235746 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.235725 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2vqnj"]
Apr 17 09:18:12.269766 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.269740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djvj\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-kube-api-access-9djvj\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.269913 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.269783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.371054 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.371030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9djvj\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-kube-api-access-9djvj\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.371156 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.371068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.379708 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.379683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.380187 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.380167 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djvj\" (UniqueName: \"kubernetes.io/projected/25afe25a-7471-4f4f-af37-5b04b49cad65-kube-api-access-9djvj\") pod \"cert-manager-webhook-597b96b99b-2vqnj\" (UID: \"25afe25a-7471-4f4f-af37-5b04b49cad65\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.551735 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.551659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:12.666395 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:12.666368 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2vqnj"]
Apr 17 09:18:12.669712 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:18:12.669683 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25afe25a_7471_4f4f_af37_5b04b49cad65.slice/crio-47226cf3644b0ecfd7aad67669df9cc1488dcda020109e60dde09426165910b5 WatchSource:0}: Error finding container 47226cf3644b0ecfd7aad67669df9cc1488dcda020109e60dde09426165910b5: Status 404 returned error can't find the container with id 47226cf3644b0ecfd7aad67669df9cc1488dcda020109e60dde09426165910b5
Apr 17 09:18:13.101792 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:13.101759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj" event={"ID":"25afe25a-7471-4f4f-af37-5b04b49cad65","Type":"ContainerStarted","Data":"47226cf3644b0ecfd7aad67669df9cc1488dcda020109e60dde09426165910b5"}
Apr 17 09:18:15.869577 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:15.869485 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bln28"]
Apr 17 09:18:15.873435 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:15.873413 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:15.876123 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:15.876085 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-bdmsd\""
Apr 17 09:18:15.880658 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:15.880632 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bln28"]
Apr 17 09:18:16.003804 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.003762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.004053 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.003933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnkdr\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-kube-api-access-jnkdr\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.104512 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.104471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.104663 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.104549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnkdr\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-kube-api-access-jnkdr\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.114199 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.114156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.114508 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.114481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnkdr\" (UniqueName: \"kubernetes.io/projected/a25cfb99-63bc-43e2-be53-ea7eeb15ba27-kube-api-access-jnkdr\") pod \"cert-manager-cainjector-8966b78d4-bln28\" (UID: \"a25cfb99-63bc-43e2-be53-ea7eeb15ba27\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.184551 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.184525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28"
Apr 17 09:18:16.316612 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:16.316585 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bln28"]
Apr 17 09:18:16.318485 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:18:16.318457 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25cfb99_63bc_43e2_be53_ea7eeb15ba27.slice/crio-8a368772490b36c87a0b506a60ba38f4869f05e28f2905abe0410f6dfb1f4484 WatchSource:0}: Error finding container 8a368772490b36c87a0b506a60ba38f4869f05e28f2905abe0410f6dfb1f4484: Status 404 returned error can't find the container with id 8a368772490b36c87a0b506a60ba38f4869f05e28f2905abe0410f6dfb1f4484
Apr 17 09:18:17.116540 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.116504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28" event={"ID":"a25cfb99-63bc-43e2-be53-ea7eeb15ba27","Type":"ContainerStarted","Data":"42e7a75a454503b2be0d21768396546785a3a0370fb7da8fbc201e389c7eccfb"}
Apr 17 09:18:17.116985 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.116552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28" event={"ID":"a25cfb99-63bc-43e2-be53-ea7eeb15ba27","Type":"ContainerStarted","Data":"8a368772490b36c87a0b506a60ba38f4869f05e28f2905abe0410f6dfb1f4484"}
Apr 17 09:18:17.117992 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.117960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj" event={"ID":"25afe25a-7471-4f4f-af37-5b04b49cad65","Type":"ContainerStarted","Data":"3926ee6b08050085c2194cb55efa85077860f61c23a784ece27feb17e802d867"}
Apr 17 09:18:17.118109 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.118077 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:17.136402 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.136351 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-bln28" podStartSLOduration=2.136339716 podStartE2EDuration="2.136339716s" podCreationTimestamp="2026-04-17 09:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:18:17.133938396 +0000 UTC m=+422.035995776" watchObservedRunningTime="2026-04-17 09:18:17.136339716 +0000 UTC m=+422.038397097"
Apr 17 09:18:17.156060 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:17.155992 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj" podStartSLOduration=1.713226949 podStartE2EDuration="5.155978018s" podCreationTimestamp="2026-04-17 09:18:12 +0000 UTC" firstStartedPulling="2026-04-17 09:18:12.671431738 +0000 UTC m=+417.573489101" lastFinishedPulling="2026-04-17 09:18:16.114182806 +0000 UTC m=+421.016240170" observedRunningTime="2026-04-17 09:18:17.154561006 +0000 UTC m=+422.056618387" watchObservedRunningTime="2026-04-17 09:18:17.155978018 +0000 UTC m=+422.058035399"
Apr 17 09:18:23.123971 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:23.123943 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2vqnj"
Apr 17 09:18:25.183043 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.183012 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-49gpd"]
Apr 17 09:18:25.186444 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.186429 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.188960 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.188941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mh7f2\""
Apr 17 09:18:25.192923 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.192902 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-49gpd"]
Apr 17 09:18:25.282613 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.282583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-bound-sa-token\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.282779 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.282653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mzh\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-kube-api-access-v6mzh\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.383631 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.383596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6mzh\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-kube-api-access-v6mzh\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.383796 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.383648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-bound-sa-token\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.394386 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.394350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-bound-sa-token\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.394491 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.394403 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mzh\" (UniqueName: \"kubernetes.io/projected/b685fde2-86c3-4314-b50e-ec33a3e085bb-kube-api-access-v6mzh\") pod \"cert-manager-759f64656b-49gpd\" (UID: \"b685fde2-86c3-4314-b50e-ec33a3e085bb\") " pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.496867 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.496822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-49gpd"
Apr 17 09:18:25.611717 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:25.611692 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-49gpd"]
Apr 17 09:18:25.614187 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:18:25.614159 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb685fde2_86c3_4314_b50e_ec33a3e085bb.slice/crio-d145b24be759bc2fd1659b00135066eff43fe58281e8b8291c5162a84d43732e WatchSource:0}: Error finding container d145b24be759bc2fd1659b00135066eff43fe58281e8b8291c5162a84d43732e: Status 404 returned error can't find the container with id d145b24be759bc2fd1659b00135066eff43fe58281e8b8291c5162a84d43732e
Apr 17 09:18:26.147741 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.147704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-49gpd" event={"ID":"b685fde2-86c3-4314-b50e-ec33a3e085bb","Type":"ContainerStarted","Data":"18a0a13ef78ab4db81dbba594b4e966d92b7cdde5b8f6ac319acbcb350b3fe1c"}
Apr 17 09:18:26.147741 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.147743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-49gpd" event={"ID":"b685fde2-86c3-4314-b50e-ec33a3e085bb","Type":"ContainerStarted","Data":"d145b24be759bc2fd1659b00135066eff43fe58281e8b8291c5162a84d43732e"}
Apr 17 09:18:26.166581 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.166540 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-49gpd" podStartSLOduration=1.166526505 podStartE2EDuration="1.166526505s" podCreationTimestamp="2026-04-17 09:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:18:26.164257211 +0000 UTC m=+431.066314590" watchObservedRunningTime="2026-04-17 09:18:26.166526505 +0000 UTC m=+431.068583886"
Apr 17 09:18:26.505585 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.505552 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"]
Apr 17 09:18:26.509005 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.508984 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.511535 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.511508 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 09:18:26.511647 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.511556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 09:18:26.511647 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.511585 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-twc4k\""
Apr 17 09:18:26.519649 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.519628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"]
Apr 17 09:18:26.593084 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.593054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.593226 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.593128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.593226 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.593146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzkt\" (UniqueName: \"kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.694214 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.694191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.694365 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.694219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzkt\" (UniqueName: \"kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"
Apr 17 09:18:26.694365 ip-10-0-130-147
kubenswrapper[2578]: I0417 09:18:26.694261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:26.694569 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.694551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:26.694611 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.694589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:26.706293 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.706269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzkt\" (UniqueName: \"kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:26.819498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.819428 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:26.936596 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:26.936577 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl"] Apr 17 09:18:26.938774 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:18:26.938746 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf7fe8c7_7527_4ebf_9d9e_1dfea15c49fb.slice/crio-0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63 WatchSource:0}: Error finding container 0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63: Status 404 returned error can't find the container with id 0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63 Apr 17 09:18:27.152606 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:27.152530 2578 generic.go:358] "Generic (PLEG): container finished" podID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerID="ea3b220b61f5d69a304cf015b047ff41c7c54fbf4e5bb3f7c452d072770990ef" exitCode=0 Apr 17 09:18:27.152749 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:27.152602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" event={"ID":"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb","Type":"ContainerDied","Data":"ea3b220b61f5d69a304cf015b047ff41c7c54fbf4e5bb3f7c452d072770990ef"} Apr 17 09:18:27.152749 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:27.152634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" event={"ID":"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb","Type":"ContainerStarted","Data":"0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63"} Apr 17 09:18:29.161931 ip-10-0-130-147 kubenswrapper[2578]: 
I0417 09:18:29.161905 2578 generic.go:358] "Generic (PLEG): container finished" podID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerID="67d2fdd8170a2a4c2fa9b8095a9e2e12fcc6c307885e621ebce81c07fe05537a" exitCode=0 Apr 17 09:18:29.162232 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:29.161989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" event={"ID":"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb","Type":"ContainerDied","Data":"67d2fdd8170a2a4c2fa9b8095a9e2e12fcc6c307885e621ebce81c07fe05537a"} Apr 17 09:18:30.167047 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:30.167009 2578 generic.go:358] "Generic (PLEG): container finished" podID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerID="057f350c6b8dec308fee283764636a956ee54507c84900767d3e0f3722658fc3" exitCode=0 Apr 17 09:18:30.167400 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:30.167082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" event={"ID":"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb","Type":"ContainerDied","Data":"057f350c6b8dec308fee283764636a956ee54507c84900767d3e0f3722658fc3"} Apr 17 09:18:31.285822 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.285801 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:31.432626 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.432551 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkzkt\" (UniqueName: \"kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt\") pod \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " Apr 17 09:18:31.432626 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.432593 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle\") pod \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " Apr 17 09:18:31.432864 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.432715 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util\") pod \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\" (UID: \"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb\") " Apr 17 09:18:31.433040 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.433020 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle" (OuterVolumeSpecName: "bundle") pod "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" (UID: "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:18:31.434616 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.434596 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt" (OuterVolumeSpecName: "kube-api-access-xkzkt") pod "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" (UID: "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb"). InnerVolumeSpecName "kube-api-access-xkzkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:18:31.440372 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.440331 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util" (OuterVolumeSpecName: "util") pod "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" (UID: "af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:18:31.534194 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.534169 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkzkt\" (UniqueName: \"kubernetes.io/projected/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-kube-api-access-xkzkt\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:18:31.534194 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.534192 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-bundle\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:18:31.534321 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:31.534201 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb-util\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:18:32.174622 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:32.174595 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" Apr 17 09:18:32.174781 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:32.174597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78erh5tl" event={"ID":"af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb","Type":"ContainerDied","Data":"0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63"} Apr 17 09:18:32.174781 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:18:32.174699 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0faaf89f487f0be87f5197ecfae47c4235ef6177b0000303e8a7531fdfac2a63" Apr 17 09:19:34.887714 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.887680 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g7cc6/must-gather-tt6mb"] Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888045 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="util" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888057 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="util" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888065 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="pull" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888070 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="pull" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888085 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="extract" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888091 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="extract" Apr 17 09:19:34.888218 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.888143 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="af7fe8c7-7527-4ebf-9d9e-1dfea15c49fb" containerName="extract" Apr 17 09:19:34.891042 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.891025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:34.893663 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.893636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g7cc6\"/\"openshift-service-ca.crt\"" Apr 17 09:19:34.893796 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.893779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g7cc6\"/\"default-dockercfg-7lpf7\"" Apr 17 09:19:34.893896 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.893789 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g7cc6\"/\"kube-root-ca.crt\"" Apr 17 09:19:34.899892 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:34.899869 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7cc6/must-gather-tt6mb"] Apr 17 09:19:35.027410 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.027371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output\") pod \"must-gather-tt6mb\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 
09:19:35.027606 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.027426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxj5\" (UniqueName: \"kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5\") pod \"must-gather-tt6mb\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.128739 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.128701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output\") pod \"must-gather-tt6mb\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.128907 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.128758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxj5\" (UniqueName: \"kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5\") pod \"must-gather-tt6mb\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.129104 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.129080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output\") pod \"must-gather-tt6mb\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.138310 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.138242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxj5\" (UniqueName: \"kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5\") pod \"must-gather-tt6mb\" (UID: 
\"d426c725-2a21-4ff6-8ee2-030551453cc8\") " pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.201486 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.201452 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:19:35.321383 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.321337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g7cc6/must-gather-tt6mb"] Apr 17 09:19:35.323696 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:19:35.323668 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd426c725_2a21_4ff6_8ee2_030551453cc8.slice/crio-57af13de3b561f357310171161dc9301f3e9d3887aad0bb6e611ce49d71bd711 WatchSource:0}: Error finding container 57af13de3b561f357310171161dc9301f3e9d3887aad0bb6e611ce49d71bd711: Status 404 returned error can't find the container with id 57af13de3b561f357310171161dc9301f3e9d3887aad0bb6e611ce49d71bd711 Apr 17 09:19:35.389303 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:35.389220 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" event={"ID":"d426c725-2a21-4ff6-8ee2-030551453cc8","Type":"ContainerStarted","Data":"57af13de3b561f357310171161dc9301f3e9d3887aad0bb6e611ce49d71bd711"} Apr 17 09:19:40.413032 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:40.412992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" event={"ID":"d426c725-2a21-4ff6-8ee2-030551453cc8","Type":"ContainerStarted","Data":"b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d"} Apr 17 09:19:40.413032 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:40.413033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" 
event={"ID":"d426c725-2a21-4ff6-8ee2-030551453cc8","Type":"ContainerStarted","Data":"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774"} Apr 17 09:19:40.430535 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:19:40.430488 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" podStartSLOduration=1.812822728 podStartE2EDuration="6.430473366s" podCreationTimestamp="2026-04-17 09:19:34 +0000 UTC" firstStartedPulling="2026-04-17 09:19:35.325276485 +0000 UTC m=+500.227333843" lastFinishedPulling="2026-04-17 09:19:39.942927123 +0000 UTC m=+504.844984481" observedRunningTime="2026-04-17 09:19:40.428382011 +0000 UTC m=+505.330439392" watchObservedRunningTime="2026-04-17 09:19:40.430473366 +0000 UTC m=+505.332530747" Apr 17 09:20:24.583213 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:24.583178 2578 generic.go:358] "Generic (PLEG): container finished" podID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerID="4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774" exitCode=0 Apr 17 09:20:24.583647 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:24.583253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" event={"ID":"d426c725-2a21-4ff6-8ee2-030551453cc8","Type":"ContainerDied","Data":"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774"} Apr 17 09:20:24.583647 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:24.583570 2578 scope.go:117] "RemoveContainer" containerID="4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774" Apr 17 09:20:24.838603 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:24.838526 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7cc6_must-gather-tt6mb_d426c725-2a21-4ff6-8ee2-030551453cc8/gather/0.log" Apr 17 09:20:28.432676 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:28.432640 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-wc8rj_0f470e3b-3dff-4a2d-9794-fbd675ca2ae9/global-pull-secret-syncer/0.log" Apr 17 09:20:28.540814 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:28.540777 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ht9mz_5f2e0357-0335-4771-8c1a-7da849e626c2/konnectivity-agent/0.log" Apr 17 09:20:28.593002 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:28.592974 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-147.ec2.internal_2e42dd9ccf4c8146ae96ed62e2dab724/haproxy/0.log" Apr 17 09:20:29.917921 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:29.917888 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6f8785-p8cv6"] Apr 17 09:20:29.921382 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:29.921358 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:29.937365 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:29.937338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6f8785-p8cv6"] Apr 17 09:20:30.020970 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.020940 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.020970 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.020972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-oauth-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: 
\"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.021176 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.021002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-trusted-ca-bundle\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.021176 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.021104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-service-ca\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.021176 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.021140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-console-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.021176 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.021163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-oauth-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.021299 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.021186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fvc4\" 
(UniqueName: \"kubernetes.io/projected/9051c368-88c3-481a-8c38-ebde24098f56-kube-api-access-8fvc4\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121640 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-oauth-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121640 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fvc4\" (UniqueName: \"kubernetes.io/projected/9051c368-88c3-481a-8c38-ebde24098f56-kube-api-access-8fvc4\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121793 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121793 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-oauth-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121793 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121729 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-trusted-ca-bundle\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121793 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-service-ca\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.121991 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.121814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-console-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.122523 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.122499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-trusted-ca-bundle\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.122636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.122592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-console-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.122636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.122592 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-service-ca\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.122753 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.122678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9051c368-88c3-481a-8c38-ebde24098f56-oauth-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.124284 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.124262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-serving-cert\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.124360 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.124262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9051c368-88c3-481a-8c38-ebde24098f56-console-oauth-config\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.134498 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.134475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fvc4\" (UniqueName: \"kubernetes.io/projected/9051c368-88c3-481a-8c38-ebde24098f56-kube-api-access-8fvc4\") pod \"console-5f6f8785-p8cv6\" (UID: \"9051c368-88c3-481a-8c38-ebde24098f56\") " pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.195249 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:20:30.195193 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g7cc6/must-gather-tt6mb"] Apr 17 09:20:30.195415 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.195393 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="copy" containerID="cri-o://b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d" gracePeriod=2 Apr 17 09:20:30.202621 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.202596 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g7cc6/must-gather-tt6mb"] Apr 17 09:20:30.230188 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.230162 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6f8785-p8cv6" Apr 17 09:20:30.357763 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.357738 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6f8785-p8cv6"] Apr 17 09:20:30.360341 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:20:30.360309 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9051c368_88c3_481a_8c38_ebde24098f56.slice/crio-f173f3367f83ac037dcc908a080a4f39d5fdcee3a21be9d78cbd9df0528103fc WatchSource:0}: Error finding container f173f3367f83ac037dcc908a080a4f39d5fdcee3a21be9d78cbd9df0528103fc: Status 404 returned error can't find the container with id f173f3367f83ac037dcc908a080a4f39d5fdcee3a21be9d78cbd9df0528103fc Apr 17 09:20:30.424574 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.424549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7cc6_must-gather-tt6mb_d426c725-2a21-4ff6-8ee2-030551453cc8/copy/0.log" Apr 17 09:20:30.424930 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.424915 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:20:30.427102 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.427076 2578 status_manager.go:895] "Failed to get status for pod" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" err="pods \"must-gather-tt6mb\" is forbidden: User \"system:node:ip-10-0-130-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g7cc6\": no relationship found between node 'ip-10-0-130-147.ec2.internal' and this object" Apr 17 09:20:30.526014 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.525982 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output\") pod \"d426c725-2a21-4ff6-8ee2-030551453cc8\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " Apr 17 09:20:30.526170 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.526076 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbxj5\" (UniqueName: \"kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5\") pod \"d426c725-2a21-4ff6-8ee2-030551453cc8\" (UID: \"d426c725-2a21-4ff6-8ee2-030551453cc8\") " Apr 17 09:20:30.527386 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.527362 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d426c725-2a21-4ff6-8ee2-030551453cc8" (UID: "d426c725-2a21-4ff6-8ee2-030551453cc8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:20:30.528225 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.528202 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5" (OuterVolumeSpecName: "kube-api-access-qbxj5") pod "d426c725-2a21-4ff6-8ee2-030551453cc8" (UID: "d426c725-2a21-4ff6-8ee2-030551453cc8"). InnerVolumeSpecName "kube-api-access-qbxj5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:20:30.606166 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.606129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f8785-p8cv6" event={"ID":"9051c368-88c3-481a-8c38-ebde24098f56","Type":"ContainerStarted","Data":"bf522eccb34f82aff69db6229cb6284b0bae71b7dc7a18d44553b44a959b5129"} Apr 17 09:20:30.606336 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.606173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6f8785-p8cv6" event={"ID":"9051c368-88c3-481a-8c38-ebde24098f56","Type":"ContainerStarted","Data":"f173f3367f83ac037dcc908a080a4f39d5fdcee3a21be9d78cbd9df0528103fc"} Apr 17 09:20:30.607195 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.607178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g7cc6_must-gather-tt6mb_d426c725-2a21-4ff6-8ee2-030551453cc8/copy/0.log" Apr 17 09:20:30.607464 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.607444 2578 generic.go:358] "Generic (PLEG): container finished" podID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerID="b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d" exitCode=143 Apr 17 09:20:30.607519 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.607487 2578 scope.go:117] "RemoveContainer" containerID="b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d" Apr 17 09:20:30.607557 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:20:30.607488 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" Apr 17 09:20:30.615609 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.615594 2578 scope.go:117] "RemoveContainer" containerID="4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774" Apr 17 09:20:30.626566 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.626544 2578 scope.go:117] "RemoveContainer" containerID="b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d" Apr 17 09:20:30.626667 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.626646 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d426c725-2a21-4ff6-8ee2-030551453cc8-must-gather-output\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:20:30.626731 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.626673 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbxj5\" (UniqueName: \"kubernetes.io/projected/d426c725-2a21-4ff6-8ee2-030551453cc8-kube-api-access-qbxj5\") on node \"ip-10-0-130-147.ec2.internal\" DevicePath \"\"" Apr 17 09:20:30.626879 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:20:30.626857 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d\": container with ID starting with b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d not found: ID does not exist" containerID="b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d" Apr 17 09:20:30.626933 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.626888 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d"} err="failed to get container status 
\"b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d\": rpc error: code = NotFound desc = could not find container \"b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d\": container with ID starting with b02c4afb914a9e3d54e7b1eb12fd0a2f869aa2622e6404da0e1ec6b35b09d20d not found: ID does not exist" Apr 17 09:20:30.626933 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.626921 2578 scope.go:117] "RemoveContainer" containerID="4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774" Apr 17 09:20:30.627184 ip-10-0-130-147 kubenswrapper[2578]: E0417 09:20:30.627164 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774\": container with ID starting with 4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774 not found: ID does not exist" containerID="4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774" Apr 17 09:20:30.627240 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.627195 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774"} err="failed to get container status \"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774\": rpc error: code = NotFound desc = could not find container \"4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774\": container with ID starting with 4ef5fcc611f01c177ade02c49f04b58cf8e2f1e23d0d4818e3a92f49860b9774 not found: ID does not exist" Apr 17 09:20:30.627712 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.627670 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6f8785-p8cv6" podStartSLOduration=1.627657847 podStartE2EDuration="1.627657847s" podCreationTimestamp="2026-04-17 09:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:20:30.625569086 +0000 UTC m=+555.527626466" watchObservedRunningTime="2026-04-17 09:20:30.627657847 +0000 UTC m=+555.529715230" Apr 17 09:20:30.627796 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.627690 2578 status_manager.go:895] "Failed to get status for pod" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" err="pods \"must-gather-tt6mb\" is forbidden: User \"system:node:ip-10-0-130-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g7cc6\": no relationship found between node 'ip-10-0-130-147.ec2.internal' and this object" Apr 17 09:20:30.629890 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:30.629869 2578 status_manager.go:895] "Failed to get status for pod" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" pod="openshift-must-gather-g7cc6/must-gather-tt6mb" err="pods \"must-gather-tt6mb\" is forbidden: User \"system:node:ip-10-0-130-147.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g7cc6\": no relationship found between node 'ip-10-0-130-147.ec2.internal' and this object" Apr 17 09:20:31.660067 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:31.660038 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" path="/var/lib/kubelet/pods/d426c725-2a21-4ff6-8ee2-030551453cc8/volumes" Apr 17 09:20:31.903003 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:31.902969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/alertmanager/0.log" Apr 17 09:20:31.929443 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:31.929381 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/config-reloader/0.log" Apr 17 09:20:31.955097 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:31.955076 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/kube-rbac-proxy-web/0.log" Apr 17 09:20:31.981260 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:31.981238 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/kube-rbac-proxy/0.log" Apr 17 09:20:32.009784 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.009759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/kube-rbac-proxy-metric/0.log" Apr 17 09:20:32.035215 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.035191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/prom-label-proxy/0.log" Apr 17 09:20:32.063222 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.063203 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_feb2874c-8009-4328-a602-437bc8212b5a/init-config-reloader/0.log" Apr 17 09:20:32.102160 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.102131 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tkbrp_1e2a7bc9-c500-464d-a161-c668f67f1430/cluster-monitoring-operator/0.log" Apr 17 09:20:32.127137 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.127116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-95wqk_fa4648bd-7936-480a-ab85-3130c7a997c6/kube-state-metrics/0.log" Apr 17 09:20:32.149830 ip-10-0-130-147 kubenswrapper[2578]: I0417 
09:20:32.149810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-95wqk_fa4648bd-7936-480a-ab85-3130c7a997c6/kube-rbac-proxy-main/0.log" Apr 17 09:20:32.171891 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.171868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-95wqk_fa4648bd-7936-480a-ab85-3130c7a997c6/kube-rbac-proxy-self/0.log" Apr 17 09:20:32.200321 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.200273 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-884d56797-hgwvz_714bba7b-0745-4cc4-8d29-654fd2dd34f3/metrics-server/0.log" Apr 17 09:20:32.229904 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.229885 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-g424c_1ed627b8-2f3b-4512-bf7a-1110aa6c22b5/monitoring-plugin/0.log" Apr 17 09:20:32.265382 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.265363 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bt9x7_11b40bf5-a000-48d0-8a8c-3011e6e7249c/node-exporter/0.log" Apr 17 09:20:32.286461 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.286444 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bt9x7_11b40bf5-a000-48d0-8a8c-3011e6e7249c/kube-rbac-proxy/0.log" Apr 17 09:20:32.308567 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.308551 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bt9x7_11b40bf5-a000-48d0-8a8c-3011e6e7249c/init-textfile/0.log" Apr 17 09:20:32.520565 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.520534 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gc4f5_5e1ed79e-f41a-49ad-ab05-2843eaef7806/kube-rbac-proxy-main/0.log" 
Apr 17 09:20:32.559986 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.559963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gc4f5_5e1ed79e-f41a-49ad-ab05-2843eaef7806/kube-rbac-proxy-self/0.log" Apr 17 09:20:32.584740 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.584722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gc4f5_5e1ed79e-f41a-49ad-ab05-2843eaef7806/openshift-state-metrics/0.log" Apr 17 09:20:32.884505 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.884432 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bfb545dc8-4g5kk_cdb69cfc-28ac-4182-a68a-432310e1dad2/telemeter-client/0.log" Apr 17 09:20:32.915648 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.915608 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bfb545dc8-4g5kk_cdb69cfc-28ac-4182-a68a-432310e1dad2/reload/0.log" Apr 17 09:20:32.946407 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:32.946384 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bfb545dc8-4g5kk_cdb69cfc-28ac-4182-a68a-432310e1dad2/kube-rbac-proxy/0.log" Apr 17 09:20:34.901735 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:34.901706 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6f8785-p8cv6_9051c368-88c3-481a-8c38-ebde24098f56/console/0.log" Apr 17 09:20:34.927031 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:34.927004 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-859c59bdc7-bx7tg_addbe683-fd42-4f4c-a595-2fa6f37d3250/console/0.log" Apr 17 09:20:34.958377 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:34.958351 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-knk5v_276b809a-2684-4b9d-9c50-dcc32d5cbe03/download-server/0.log" Apr 17 09:20:35.135773 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.135737 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv"] Apr 17 09:20:35.136174 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136156 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="gather" Apr 17 09:20:35.136261 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136176 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="gather" Apr 17 09:20:35.136261 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136204 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="copy" Apr 17 09:20:35.136261 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136212 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="copy" Apr 17 09:20:35.136397 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136321 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="copy" Apr 17 09:20:35.136397 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.136339 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d426c725-2a21-4ff6-8ee2-030551453cc8" containerName="gather" Apr 17 09:20:35.139485 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.139463 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.144648 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.144625 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"kube-root-ca.crt\"" Apr 17 09:20:35.144755 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.144718 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-v8842\"/\"default-dockercfg-mxnlg\"" Apr 17 09:20:35.145979 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.145957 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"openshift-service-ca.crt\"" Apr 17 09:20:35.154981 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.154933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv"] Apr 17 09:20:35.269133 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.269105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2c7\" (UniqueName: \"kubernetes.io/projected/061980c7-be2b-4bdb-b7ee-ad72b90639a2-kube-api-access-rr2c7\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.269265 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.269149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-podres\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.269265 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.269204 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-sys\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.269384 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.269287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-proc\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.269384 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.269364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-lib-modules\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370500 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-podres\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-sys\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " 
pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-proc\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-lib-modules\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2c7\" (UniqueName: \"kubernetes.io/projected/061980c7-be2b-4bdb-b7ee-ad72b90639a2-kube-api-access-rr2c7\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370636 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-sys\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370860 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-podres\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370860 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-proc\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.370860 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.370738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/061980c7-be2b-4bdb-b7ee-ad72b90639a2-lib-modules\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.381125 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.381109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2c7\" (UniqueName: \"kubernetes.io/projected/061980c7-be2b-4bdb-b7ee-ad72b90639a2-kube-api-access-rr2c7\") pod \"perf-node-gather-daemonset-mb2tv\" (UID: \"061980c7-be2b-4bdb-b7ee-ad72b90639a2\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.449563 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.449500 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" Apr 17 09:20:35.571736 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.571562 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv"] Apr 17 09:20:35.574052 ip-10-0-130-147 kubenswrapper[2578]: W0417 09:20:35.574020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod061980c7_be2b_4bdb_b7ee_ad72b90639a2.slice/crio-d3144a79c118e54b72dee0f1a584676046ae613a3c34aaa683a46d365bc5b358 WatchSource:0}: Error finding container d3144a79c118e54b72dee0f1a584676046ae613a3c34aaa683a46d365bc5b358: Status 404 returned error can't find the container with id d3144a79c118e54b72dee0f1a584676046ae613a3c34aaa683a46d365bc5b358 Apr 17 09:20:35.626046 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:35.626020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" event={"ID":"061980c7-be2b-4bdb-b7ee-ad72b90639a2","Type":"ContainerStarted","Data":"d3144a79c118e54b72dee0f1a584676046ae613a3c34aaa683a46d365bc5b358"} Apr 17 09:20:36.143933 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.143888 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mw8fd_b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5/dns/0.log" Apr 17 09:20:36.168550 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.168525 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mw8fd_b4cf6a67-48da-489c-8f76-ffb0dd4d5fe5/kube-rbac-proxy/0.log" Apr 17 09:20:36.244425 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.244397 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7lzx7_8e70ed4d-e5a1-4a10-931b-32fe40414a5a/dns-node-resolver/0.log" Apr 17 09:20:36.630936 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.630905 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" event={"ID":"061980c7-be2b-4bdb-b7ee-ad72b90639a2","Type":"ContainerStarted","Data":"da6f0f7643b6acb641ef2c88b2d347bd07459f8c849670bf9bf03ff62fa529c9"}
Apr 17 09:20:36.631101 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.631028 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv"
Apr 17 09:20:36.648706 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.648668 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv" podStartSLOduration=1.648657414 podStartE2EDuration="1.648657414s" podCreationTimestamp="2026-04-17 09:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:20:36.648631788 +0000 UTC m=+561.550689168" watchObservedRunningTime="2026-04-17 09:20:36.648657414 +0000 UTC m=+561.550714794"
Apr 17 09:20:36.732986 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:36.732961 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mwm8s_054fc5ee-b86e-42a4-85c2-322e7ca088cf/node-ca/0.log"
Apr 17 09:20:37.808833 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:37.808803 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jmvr9_a0becf09-e2a4-4fea-a602-f69826ef0f66/serve-healthcheck-canary/0.log"
Apr 17 09:20:38.184115 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:38.184028 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rlbzq_df06ee4d-da4b-4812-876f-8b39a0419cca/insights-operator/1.log"
Apr 17 09:20:38.184920 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:38.184889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rlbzq_df06ee4d-da4b-4812-876f-8b39a0419cca/insights-operator/0.log"
Apr 17 09:20:38.346788 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:38.346761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fmzk9_a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f/kube-rbac-proxy/0.log"
Apr 17 09:20:38.368292 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:38.368273 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fmzk9_a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f/exporter/0.log"
Apr 17 09:20:38.390576 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:38.390544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fmzk9_a95dc232-5251-4fa8-b5fe-cd5f8fbb5d6f/extractor/0.log"
Apr 17 09:20:40.230930 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:40.230884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6f8785-p8cv6"
Apr 17 09:20:40.230930 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:40.230942 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f6f8785-p8cv6"
Apr 17 09:20:40.235316 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:40.235293 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f6f8785-p8cv6"
Apr 17 09:20:40.648719 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:40.648648 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f6f8785-p8cv6"
Apr 17 09:20:40.693657 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:40.693631 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-859c59bdc7-bx7tg"]
Apr 17 09:20:42.644145 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:42.644112 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-mb2tv"
Apr 17 09:20:43.278530 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:43.278498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2tbn5_7ce72122-fc1c-425c-84e0-f0f52bc442e2/migrator/0.log"
Apr 17 09:20:43.299096 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:43.299078 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2tbn5_7ce72122-fc1c-425c-84e0-f0f52bc442e2/graceful-termination/0.log"
Apr 17 09:20:44.697742 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:44.697713 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4p8q2_164bdfae-ab57-4679-8440-11f5f905aca9/kube-multus/0.log"
Apr 17 09:20:44.934971 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:44.934903 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/kube-multus-additional-cni-plugins/0.log"
Apr 17 09:20:44.959427 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:44.959404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/egress-router-binary-copy/0.log"
Apr 17 09:20:44.981178 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:44.981159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/cni-plugins/0.log"
Apr 17 09:20:45.008103 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.008079 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/bond-cni-plugin/0.log"
Apr 17 09:20:45.033538 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.033515 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/routeoverride-cni/0.log"
Apr 17 09:20:45.057631 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.057610 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/whereabouts-cni-bincopy/0.log"
Apr 17 09:20:45.079940 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.079921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7hzg2_e6550303-873c-4278-9d7c-1b6d17d5f9eb/whereabouts-cni/0.log"
Apr 17 09:20:45.354198 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.354167 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4h6v9_ea2ee429-d7fa-4703-99bd-5d963ebab30c/network-metrics-daemon/0.log"
Apr 17 09:20:45.375563 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:45.375540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4h6v9_ea2ee429-d7fa-4703-99bd-5d963ebab30c/kube-rbac-proxy/0.log"
Apr 17 09:20:46.762722 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.762687 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/ovn-controller/0.log"
Apr 17 09:20:46.784912 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.784883 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/ovn-acl-logging/0.log"
Apr 17 09:20:46.808578 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.808555 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/kube-rbac-proxy-node/0.log"
Apr 17 09:20:46.834694 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.834661 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 09:20:46.852697 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.852671 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/northd/0.log"
Apr 17 09:20:46.875743 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.875699 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/nbdb/0.log"
Apr 17 09:20:46.896876 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:46.896818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/sbdb/0.log"
Apr 17 09:20:47.052129 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:47.052051 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w5vps_fdf7670d-1d61-4809-892c-ac96118b27f2/ovnkube-controller/0.log"
Apr 17 09:20:48.124314 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:48.124277 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vptzp_555d9d60-af04-44d3-b6cc-9af0c1398acd/network-check-target-container/0.log"
Apr 17 09:20:49.060034 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:49.060006 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tm6nr_c58c74c9-b33b-45a4-b98a-2c99ab16bff9/iptables-alerter/0.log"
Apr 17 09:20:49.683895 ip-10-0-130-147 kubenswrapper[2578]: I0417 09:20:49.683857 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fffcg_1f61bc12-5108-467a-9c7c-fc6b4db52b69/tuned/0.log"