Apr 28 19:15:42.483160 ip-10-0-138-119 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:15:42.911561 ip-10-0-138-119 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:42.911561 ip-10-0-138-119 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:15:42.911561 ip-10-0-138-119 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:42.911561 ip-10-0-138-119 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:15:42.911561 ip-10-0-138-119 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
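The deprecation warnings above all point the same way: move these flags into the file passed via --config. A minimal sketch of what that could look like as a kubelet.config.k8s.io/v1beta1 KubeletConfiguration — field names are from the upstream API, but the concrete values below are illustrative assumptions, not taken from this node's actual config:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (CRI-O socket, as seen in the FLAG dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an illustrative assumption)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values are illustrative assumptions)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration's warning suggests eviction settings instead
evictionHard:
  memory.available: 100Mi
```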
Apr 28 19:15:42.914689 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.914612 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:15:42.919759 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919736 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:42.919759 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919755 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:42.919759 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919761 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:42.919759 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919765 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919769 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919774 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919778 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919783 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919787 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919790 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919794 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919799 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919803 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919806 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919812 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919819 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919827 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919831 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919835 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919840 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919844 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919848 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:42.920023 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919852 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919856 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919859 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919864 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919868 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919872 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919875 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919879 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919884 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919888 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919892 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919915 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919919 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919924 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919928 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919933 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919938 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919942 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919946 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919950 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:42.920785 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919954 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919958 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919962 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919966 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919971 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919975 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919979 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919983 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919987 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919991 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.919995 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920000 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920004 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920009 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920013 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920017 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920021 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920025 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920028 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920032 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:42.921499 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920036 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920041 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920045 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920049 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920053 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920057 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920061 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920066 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920070 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920074 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920078 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920084 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920088 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920092 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920096 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920100 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920106 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920110 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920115 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920119 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:42.922031 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920123 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920127 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920131 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920136 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920737 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920746 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920750 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920755 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920759 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920763 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920767 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920771 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920776 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920780 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920784 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920787 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920791 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920795 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920800 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920804 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:42.922612 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920807 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920812 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920816 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920820 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920824 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920829 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920833 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920837 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920842 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920847 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920851 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920855 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920859 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920863 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920868 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920872 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920877 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920882 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920886 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920890 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:42.923476 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920916 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920921 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920925 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920929 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920933 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920937 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920941 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920946 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920950 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920955 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920959 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920973 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920980 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920985 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920989 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920993 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.920997 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921002 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921006 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921010 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:42.924163 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921014 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921020 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921024 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921028 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921032 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921036 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921044 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921050 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921057 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921062 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921067 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921072 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921077 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921081 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921085 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921090 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921095 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921099 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921104 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:42.924648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921109 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921113 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921117 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921121 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921125 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921129 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921135 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921140 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921144 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921148 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.921152 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922418 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922434 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922444 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922451 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922459 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922465 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922472 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922478 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922483 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922489 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:15:42.925219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922496 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922501 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922507 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922511 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922516 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922521 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922536 2565 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922542 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922546 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922553 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922557 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922562 2565 flags.go:64] FLAG: --config-dir=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922567 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922572 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922585 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922590 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922596 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922601 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922606 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922611 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922615 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922620 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922624 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922631 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922635 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:15:42.925956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922640 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922645 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922651 2565 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922656 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922663 2565 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922669 2565 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922673 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:15:42.926577
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922678 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922683 2565 flags.go:64] FLAG: --eviction-hard="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922690 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922695 2565 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922700 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922705 2565 flags.go:64] FLAG: --eviction-soft="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922710 2565 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922714 2565 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922719 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922723 2565 flags.go:64] FLAG: --experimental-mounter-path="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922728 2565 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922733 2565 flags.go:64] FLAG: --fail-swap-on="true" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922738 2565 flags.go:64] FLAG: --feature-gates="" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922744 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922749 2565 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922754 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922759 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922764 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:15:42.926577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922769 2565 flags.go:64] FLAG: --help="false" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922774 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-138-119.ec2.internal" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922779 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922784 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922789 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922795 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922800 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922805 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922809 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922814 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:15:42.927278 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:15:42.922819 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922824 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922830 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922835 2565 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922841 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922846 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922851 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922856 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922860 2565 flags.go:64] FLAG: --lock-file="" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922865 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922870 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922875 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922883 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:15:42.927278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922888 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922908 2565 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922914 2565 flags.go:64] FLAG: --logging-format="text" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922919 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922925 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922929 2565 flags.go:64] FLAG: --manifest-url="" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922933 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922940 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922945 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922952 2565 flags.go:64] FLAG: --max-pods="110" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922956 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922961 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922966 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922971 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922976 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922981 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:15:42.927853 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:15:42.922986 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.922998 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923002 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923007 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923013 2565 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923018 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923027 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923031 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:15:42.927853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923036 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923041 2565 flags.go:64] FLAG: --port="10250" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923046 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923051 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-004d2191da6ad7a74" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923056 2565 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923061 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 28 
19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923066 2565 flags.go:64] FLAG: --register-node="true" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923071 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923076 2565 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923081 2565 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923086 2565 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923090 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923095 2565 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923102 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923107 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923111 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923116 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923120 2565 flags.go:64] FLAG: --runonce="false" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923125 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923130 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923135 2565 flags.go:64] FLAG: --seccomp-default="false" 
Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923139 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923144 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923149 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923154 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923159 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:15:42.928442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923164 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923168 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923173 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923179 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923187 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923192 2565 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923196 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923205 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923209 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:15:42.929060 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:15:42.923214 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923220 2565 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923225 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923230 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923234 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923239 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923244 2565 flags.go:64] FLAG: --v="2" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923250 2565 flags.go:64] FLAG: --version="false" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923257 2565 flags.go:64] FLAG: --vmodule="" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923264 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.923269 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923411 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923417 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923422 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: W0428 
19:15:42.923427 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:15:42.929060 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923432 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923437 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923441 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923446 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923451 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923455 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923459 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923464 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923471 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923475 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923479 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923483 2565 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923490 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923494 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923498 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923502 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923506 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923511 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923516 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923520 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:42.929698 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923525 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923529 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923533 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923537 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:15:42.930300 ip-10-0-138-119 
kubenswrapper[2565]: W0428 19:15:42.923541 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923545 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923551 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923558 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923563 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923567 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923571 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923575 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923579 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923583 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923587 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923591 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923595 2565 feature_gate.go:328] 
unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923599 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923604 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923608 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:15:42.930300 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923612 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923616 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923621 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923626 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923631 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923636 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923642 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923648 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923653 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923658 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923662 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923668 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923672 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923676 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923680 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923683 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923688 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923692 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923696 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:15:42.930797 ip-10-0-138-119 kubenswrapper[2565]: W0428 
19:15:42.923701 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923705 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923710 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923714 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923718 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923722 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923726 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923730 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923734 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923739 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923743 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923747 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923751 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: 
W0428 19:15:42.923755 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923759 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923763 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923767 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923773 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923777 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923782 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923786 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:15:42.931340 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923790 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.923794 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.924764 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.930929 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.930946 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.930991 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.930996 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931000 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931002 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931005 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931008 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931011 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931014 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931016 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931019 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 
19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931021 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:15:42.931858 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931024 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931027 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931029 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931032 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931034 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931037 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931041 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931045 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931048 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931051 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931054 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931057 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931059 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931062 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931065 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931068 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931071 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931073 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:15:42.932286 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931076 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:15:42.932286 ip-10-0-138-119 
kubenswrapper[2565]: W0428 19:15:42.931078 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931082 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931084 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931087 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931090 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931093 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931095 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931098 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931101 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931103 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931106 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931109 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931111 2565 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931114 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931117 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931120 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931122 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931125 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931127 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931130 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:15:42.932787 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931132 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931135 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931137 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931142 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931145 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931148 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931150 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931153 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931156 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931159 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931161 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931163 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931166 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931169 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931173 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931175 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931178 2565 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931180 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931183 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931185 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:15:42.933278 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931188 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931190 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931193 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931195 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931198 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931200 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931202 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931205 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931208 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:15:42.933751 ip-10-0-138-119 
kubenswrapper[2565]: W0428 19:15:42.931210 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931213 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931216 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931218 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931221 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931223 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:15:42.933751 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931226 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.931231 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931321 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931325 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 
19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931328 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931332 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931336 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931339 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931343 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931346 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931349 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931353 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931355 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931358 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931361 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:15:42.934133 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931363 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 
19:15:42.931366 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931368 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931371 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931373 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931376 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931378 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931381 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931383 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931386 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931389 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931391 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931394 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931396 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:15:42.934509 
ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931399 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931402 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931404 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931406 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931409 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931411 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:15:42.934509 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931414 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931416 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931418 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931421 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931424 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931426 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931429 2565 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931433 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931437 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931440 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931443 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931446 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931449 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931452 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931455 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931457 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931460 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931462 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931465 2565 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 28 19:15:42.935011 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931468 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931470 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931473 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931475 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931478 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931480 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931482 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931485 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931488 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931490 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931493 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931495 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931497 2565 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931500 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931502 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931505 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931507 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931509 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931513 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931515 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:15:42.935484 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931518 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931521 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931523 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931526 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931529 2565 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931531 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931534 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931537 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931539 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931542 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931544 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931547 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931549 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:42.931552 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.931557 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 28 19:15:42.935984 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.932237 2565 server.go:962] "Client rotation is on, will bootstrap in background" Apr 28 19:15:42.936355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.934930 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 28 19:15:42.936355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.935916 2565 server.go:1019] "Starting client certificate rotation" Apr 28 19:15:42.936355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.936016 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 28 19:15:42.936355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.936058 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 28 19:15:42.962209 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.962192 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 28 19:15:42.964478 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.964462 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 28 19:15:42.977879 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.977864 2565 log.go:25] "Validated CRI v1 runtime API" Apr 28 19:15:42.982824 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.982811 2565 log.go:25] "Validated CRI v1 image API" Apr 28 19:15:42.983944 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.983928 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 28 19:15:42.988450 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.988431 2565 fs.go:135] Filesystem UUIDs: map[0b0d3e72-570c-41d8-b9e8-6c21cc8d8780:/dev/nvme0n1p4 
764a46c2-78b1-4353-ba4d-0ce9d01b17c3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 28 19:15:42.988523 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.988450 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 28 19:15:42.995856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.995744 2565 manager.go:217] Machine: {Timestamp:2026-04-28 19:15:42.992447488 +0000 UTC m=+0.392192558 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099433 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2294d936f2a4b674200df327845a24 SystemUUID:ec2294d9-36f2-a4b6-7420-0df327845a24 BootID:b567c019-9496-4e43-a458-a5fa1321863c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:17:c3:2c:8d:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:17:c3:2c:8d:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:1f:e5:81:0f:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 28 19:15:42.995856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.995843 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 28 19:15:42.996017 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.995928 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:15:42.996978 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.996954 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:15:42.997108 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.996981 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-119.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:15:42.997151 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.997117 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:15:42.997151 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.997126 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:15:42.997151 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.997139 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:15:42.998348 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.998332 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:15:42.998518 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.998506 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:15:42.999302 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.999292 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:15:42.999400 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:42.999392 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 28 19:15:43.001521 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.001511 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 28 19:15:43.001566 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.001530 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 28 19:15:43.001566 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.001542 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 28 19:15:43.001566 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.001551 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 28 19:15:43.001566 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.001560 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 28 19:15:43.002554 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.002543 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:15:43.002599 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.002561 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:15:43.005146 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.005131 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 28 19:15:43.006504 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.006488 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 28 19:15:43.007924 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007910 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007932 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007941 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007949 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007958 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007967 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007976 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007986 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 28 19:15:43.007994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.007995 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 28 19:15:43.008243 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.008004 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 28 19:15:43.008243 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.008016 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 28 19:15:43.008243 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.008029 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 28 19:15:43.008880 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.008869 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 28 19:15:43.008946 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.008883 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 28 19:15:43.012463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.012449 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 28 19:15:43.012547 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.012490 2565 server.go:1295] "Started kubelet"
Apr 28 19:15:43.012636 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.012585 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 28 19:15:43.012677 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.012632 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 28 19:15:43.012715 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.012679 2565 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 28 19:15:43.013349 ip-10-0-138-119 systemd[1]: Started Kubernetes Kubelet.
Apr 28 19:15:43.016356 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.016319 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 28 19:15:43.017040 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.017022 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 28 19:15:43.017701 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.017673 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-119.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 28 19:15:43.018165 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.018140 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 28 19:15:43.018228 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.018143 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-119.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 28 19:15:43.022847 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.022827 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 28 19:15:43.022957 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.022852 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 28 19:15:43.023491 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.023472 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 28 19:15:43.023491 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.023473 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 28 19:15:43.023620 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.023497 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 28 19:15:43.023620 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.023546 2565 reconstruct.go:97] "Volume reconstruction finished"
Apr 28 19:15:43.023620 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.023557 2565 reconciler.go:26] "Reconciler: start to sync state"
Apr 28 19:15:43.023620 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.023615 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.025302 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025284 2565 factory.go:55] Registering systemd factory
Apr 28 19:15:43.025302 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025306 2565 factory.go:223] Registration of the systemd container factory successfully
Apr 28 19:15:43.025602 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025575 2565 factory.go:153] Registering CRI-O factory
Apr 28 19:15:43.025602 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025595 2565 factory.go:223] Registration of the crio container factory successfully
Apr 28 19:15:43.025741 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025650 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 28 19:15:43.025741 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025675 2565 factory.go:103] Registering Raw factory
Apr 28 19:15:43.025741 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.025690 2565 manager.go:1196] Started watching for new ooms in manager
Apr 28 19:15:43.026130 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.026116 2565 manager.go:319] Starting recovery of all containers
Apr 28 19:15:43.026424 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.026399 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 28 19:15:43.030885 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.030857 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-119.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 28 19:15:43.031099 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.030204 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-119.ec2.internal.18aa9b4984b51a1b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-119.ec2.internal,UID:ip-10-0-138-119.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-119.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.012461083 +0000 UTC m=+0.412206154,LastTimestamp:2026-04-28 19:15:43.012461083 +0000 UTC m=+0.412206154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-119.ec2.internal,}"
Apr 28 19:15:43.031811 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.031784 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 28 19:15:43.036672 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.036656 2565 manager.go:324] Recovery completed
Apr 28 19:15:43.038199 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.038176 2565 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 28 19:15:43.041032 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.041020 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.044356 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044341 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.044413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044370 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.044413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044385 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.044891 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044876 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 28 19:15:43.044969 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044891 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 28 19:15:43.044969 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.044924 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:15:43.046799 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.046735 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-119.ec2.internal.18aa9b49869bc8be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-119.ec2.internal,UID:ip-10-0-138-119.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-119.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-119.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.044356286 +0000 UTC m=+0.444101359,LastTimestamp:2026-04-28 19:15:43.044356286 +0000 UTC m=+0.444101359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-119.ec2.internal,}"
Apr 28 19:15:43.047114 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.047102 2565 policy_none.go:49] "None policy: Start"
Apr 28 19:15:43.047172 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.047119 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 28 19:15:43.047172 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.047128 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 28 19:15:43.056131 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.056114 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-clm2m"
Apr 28 19:15:43.060153 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.060090 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-119.ec2.internal.18aa9b49869c1cfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-119.ec2.internal,UID:ip-10-0-138-119.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-138-119.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-138-119.ec2.internal,},FirstTimestamp:2026-04-28 19:15:43.04437785 +0000 UTC m=+0.444122920,LastTimestamp:2026-04-28 19:15:43.04437785 +0000 UTC m=+0.444122920,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-119.ec2.internal,}"
Apr 28 19:15:43.067233 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.067213 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-clm2m"
Apr 28 19:15:43.089438 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089423 2565 manager.go:341] "Starting Device Plugin manager"
Apr 28 19:15:43.089555 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.089470 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 28 19:15:43.089555 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089484 2565 server.go:85] "Starting device plugin registration server"
Apr 28 19:15:43.089718 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089706 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 28 19:15:43.089772 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089720 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 28 19:15:43.089836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089817 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 28 19:15:43.089944 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089924 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 28 19:15:43.089944 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.089940 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 28 19:15:43.090682 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.090662 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 28 19:15:43.090784 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.090701 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.118225 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.118196 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 28 19:15:43.119343 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.119327 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 28 19:15:43.119419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.119355 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 28 19:15:43.119419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.119375 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 28 19:15:43.119419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.119385 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 28 19:15:43.119558 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.119455 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 28 19:15:43.122547 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.122522 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:43.190005 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.189940 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.190752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.190738 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.190804 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.190767 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.190804 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.190777 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.190804 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.190800 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.197983 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.197967 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.198062 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.197988 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-119.ec2.internal\": node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.219147 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.219126 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.220246 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.220223 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"]
Apr 28 19:15:43.220293 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.220283 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.221691 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.221676 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.221766 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.221705 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.221766 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.221721 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.222874 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.222859 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.223020 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223004 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.223066 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223034 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.223587 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223572 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.223638 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223601 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.223638 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223614 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.223721 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223675 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.223721 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223695 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.223721 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.223708 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.224358 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.224342 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d1685eafe28745f79356d1935bdc8f9-config\") pod \"kube-apiserver-proxy-ip-10-0-138-119.ec2.internal\" (UID: \"9d1685eafe28745f79356d1935bdc8f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.224414 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.224367 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.224414 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.224387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.224858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.224843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.224915 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.224877 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:43.225670 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.225655 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:43.225740 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.225680 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:43.225740 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.225695 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:43.254575 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.254552 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-119.ec2.internal\" not found" node="ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.257810 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.257790 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-119.ec2.internal\" not found" node="ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.319474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.319456 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.324754 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d1685eafe28745f79356d1935bdc8f9-config\") pod \"kube-apiserver-proxy-ip-10-0-138-119.ec2.internal\" (UID: \"9d1685eafe28745f79356d1935bdc8f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.324799 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.324799 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324783 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.324861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324844 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9d1685eafe28745f79356d1935bdc8f9-config\") pod \"kube-apiserver-proxy-ip-10-0-138-119.ec2.internal\" (UID: \"9d1685eafe28745f79356d1935bdc8f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.324861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324850 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.324950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.324857 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8e7ef4154e4967cb8b4fb00da865259a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal\" (UID: \"8e7ef4154e4967cb8b4fb00da865259a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.419677 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.419657 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.520349 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.520294 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.556777 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.556760 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.560315 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.560299 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal"
Apr 28 19:15:43.620910 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.620875 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.721365 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.721341 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.821849 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.821793 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.922382 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:43.922358 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:43.935785 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.935765 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:15:43.935910 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:43.935882 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:15:44.022700 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:44.022676 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found"
Apr 28 19:15:44.023032 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.023019 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:15:44.043973 ip-10-0-138-119 kubenswrapper[2565]: I0428
19:15:44.043954 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:15:44.070109 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.070070 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:10:43 +0000 UTC" deadline="2027-09-22 18:16:46.123116931 +0000 UTC" Apr 28 19:15:44.070109 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.070106 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12287h1m2.05301573s" Apr 28 19:15:44.084037 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:44.083877 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7ef4154e4967cb8b4fb00da865259a.slice/crio-1a0b2bfe025c7058437aa063a974e9d762de29f19805aed042906d5cbe8227d8 WatchSource:0}: Error finding container 1a0b2bfe025c7058437aa063a974e9d762de29f19805aed042906d5cbe8227d8: Status 404 returned error can't find the container with id 1a0b2bfe025c7058437aa063a974e9d762de29f19805aed042906d5cbe8227d8 Apr 28 19:15:44.084304 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:44.084279 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1685eafe28745f79356d1935bdc8f9.slice/crio-372e20286ddc40af56fd90eb275b114f8d11400a5bf16c3ff12497fbe57fd81c WatchSource:0}: Error finding container 372e20286ddc40af56fd90eb275b114f8d11400a5bf16c3ff12497fbe57fd81c: Status 404 returned error can't find the container with id 372e20286ddc40af56fd90eb275b114f8d11400a5bf16c3ff12497fbe57fd81c Apr 28 19:15:44.087760 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.087746 2565 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 28 19:15:44.090291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.090272 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wz24w" Apr 28 19:15:44.101248 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.101230 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wz24w" Apr 28 19:15:44.121754 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.121712 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal" event={"ID":"9d1685eafe28745f79356d1935bdc8f9","Type":"ContainerStarted","Data":"372e20286ddc40af56fd90eb275b114f8d11400a5bf16c3ff12497fbe57fd81c"} Apr 28 19:15:44.122664 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.122641 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal" event={"ID":"8e7ef4154e4967cb8b4fb00da865259a","Type":"ContainerStarted","Data":"1a0b2bfe025c7058437aa063a974e9d762de29f19805aed042906d5cbe8227d8"} Apr 28 19:15:44.123763 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:44.123745 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found" Apr 28 19:15:44.198249 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.198230 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:44.224741 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:44.224720 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-119.ec2.internal\" not found" Apr 28 19:15:44.325173 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:44.325145 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-138-119.ec2.internal\" not found" Apr 28 19:15:44.368696 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.368652 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:44.423048 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.423027 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal" Apr 28 19:15:44.437132 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.437113 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:44.437953 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.437941 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal" Apr 28 19:15:44.468042 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.467951 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:44.504254 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:44.504226 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:45.002857 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.002831 2565 apiserver.go:52] "Watching apiserver" Apr 28 19:15:45.010534 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.010512 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:15:45.013078 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.013050 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-dn4qh","kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8","openshift-cluster-node-tuning-operator/tuned-2cht2","openshift-image-registry/node-ca-clds8","openshift-multus/multus-additional-cni-plugins-ls4b6","openshift-multus/multus-pps95","openshift-network-diagnostics/network-check-target-8gbf7","openshift-dns/node-resolver-fqkh6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal","openshift-multus/network-metrics-daemon-88gvq","openshift-network-operator/iptables-alerter-tntl8","openshift-ovn-kubernetes/ovnkube-node-d6twb"] Apr 28 19:15:45.014995 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.014970 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pps95" Apr 28 19:15:45.017134 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017098 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:15:45.017280 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017250 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:15:45.017351 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017308 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.017779 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017755 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.017956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017938 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.018061 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.017990 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n7tb2\"" Apr 28 19:15:45.018382 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.018281 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.018475 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.018411 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-clds8" Apr 28 19:15:45.019289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.019222 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:15:45.019595 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.019575 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.019942 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.019892 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.020317 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.020297 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bf2sf\"" Apr 28 19:15:45.020582 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.020447 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.021200 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021182 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.021609 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021370 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-dn4qh" Apr 28 19:15:45.021609 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021413 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kbrhp\"" Apr 28 19:15:45.021609 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021609 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.022165 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021850 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 28 19:15:45.022165 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.021960 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z7lvq\"" Apr 28 19:15:45.022165 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.022110 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.022701 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.022680 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:45.022799 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.022753 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:15:45.024491 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.024196 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.025240 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.024859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-swf9m\"" Apr 28 19:15:45.025240 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.025140 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:15:45.025240 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.025239 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:15:45.025701 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.025680 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.025869 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.025853 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:45.025977 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.025948 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:15:45.026397 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.026373 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.026523 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.026501 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w6sdf\"" Apr 28 19:15:45.026883 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.026866 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.027198 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.027181 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.028066 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.028047 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.028322 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.029791 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9bdgz\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.030299 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.030858 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.031159 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.031220 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:45.032936 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.031451 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hvftn\"" Apr 28 19:15:45.033822 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.033788 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-conf-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.033927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.033836 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-modprobe-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.033927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.033870 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysconfig\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 
19:15:45.034088 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.033991 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-lib-modules\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.034088 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034030 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/de956b02-f02a-4203-a743-d9efee946739-hosts-file\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.034088 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034059 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-var-lib-kubelet\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.034088 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034093 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-cnibin\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.034268 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.034268 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034232 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-netns\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034278 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-multus\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034312 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzjj\" (UniqueName: \"kubernetes.io/projected/d27f468f-a5ab-460e-8afc-5ff534c369dc-kube-api-access-7wzjj\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.034453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034343 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:45.034453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034370 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-bin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034424 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-kubelet\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034567 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034455 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-multus-daemon-config\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034567 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034503 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-k8s-cni-cncf-io\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034567 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034533 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-multus-certs\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034684 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034594 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-socket-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.034684 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034629 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-etc-selinux\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.034684 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034659 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.034841 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-cni-binary-copy\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.034841 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034780 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-kubelet-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.034841 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-kubernetes\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.035021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034847 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-host\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.035021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-os-release\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.035021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034921 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a28734c-75dc-4444-ae1f-b70d31a241e2-agent-certs\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:15:45.035021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-registration-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.035021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.034990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-sys-fs\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035024 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-etc-kubernetes\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035054 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96de079e-3abf-48db-8ecf-bcd571c3ed27-host\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035087 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035119 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95kh\" (UniqueName: \"kubernetes.io/projected/023534f3-e54d-45bb-b99b-12a35302ae01-kube-api-access-q95kh\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff54s\" (UniqueName: \"kubernetes.io/projected/7b2a3928-8ff2-43ed-90d0-159376b2c777-kube-api-access-ff54s\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035181 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-systemd\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035214 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96de079e-3abf-48db-8ecf-bcd571c3ed27-serviceca\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035248 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-system-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.035291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035282 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-cnibin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035313 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a28734c-75dc-4444-ae1f-b70d31a241e2-konnectivity-ca\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035327 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035346 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de956b02-f02a-4203-a743-d9efee946739-tmp-dir\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6"
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-run\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035419 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-sys\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035641 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:15:45.035763 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.035733 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:15:45.036759 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036729 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jxx\" (UniqueName: \"kubernetes.io/projected/96de079e-3abf-48db-8ecf-bcd571c3ed27-kube-api-access-25jxx\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.036849 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-device-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.036849 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036806 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-os-release\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.036849 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036833 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jzns\" (UniqueName: \"kubernetes.io/projected/de956b02-f02a-4203-a743-d9efee946739-kube-api-access-6jzns\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036884 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-hostroot\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-conf\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036961 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllgx\" (UniqueName: \"kubernetes.io/projected/f56b141f-364e-495d-9046-30f1c93dbc83-kube-api-access-dllgx\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.036989 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-socket-dir-parent\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.037033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037015 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-system-cni-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037040 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-tuned\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037091 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-tmp\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037118 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037200 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037308 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037319 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-97snv\""
Apr 28 19:15:45.037413 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.037396 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:15:45.102610 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.102567 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:44 +0000 UTC" deadline="2028-01-16 09:20:58.814197474 +0000 UTC"
Apr 28 19:15:45.102610 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.102598 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15062h5m13.711601309s"
Apr 28 19:15:45.124981 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.124964 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 28 19:15:45.137967 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.137943 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhjx\" (UniqueName: \"kubernetes.io/projected/02955ab5-cc29-48ff-8727-30e4575778cb-kube-api-access-qxhjx\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8"
Apr 28 19:15:45.138064 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.137984 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-os-release\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.138064 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138032 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a28734c-75dc-4444-ae1f-b70d31a241e2-agent-certs\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:15:45.138178 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138096 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-registration-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.138178 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138125 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-sys-fs\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.138178 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138148 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-os-release\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138153 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvnf6\" (UniqueName: \"kubernetes.io/projected/09653a58-e44e-4fb5-a021-58bc08a4765f-kube-api-access-vvnf6\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138210 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-log-socket\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138247 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-bin\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138275 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-sys-fs\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.138316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-etc-kubernetes\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138334 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96de079e-3abf-48db-8ecf-bcd571c3ed27-host\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138344 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-etc-kubernetes\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138362 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-kubelet\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138356 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a521797-0035-4102-b6e7-e3757c2a296e-ovn-node-metrics-cert\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138433 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96de079e-3abf-48db-8ecf-bcd571c3ed27-host\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138443 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02955ab5-cc29-48ff-8727-30e4575778cb-host-slash\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q95kh\" (UniqueName: \"kubernetes.io/projected/023534f3-e54d-45bb-b99b-12a35302ae01-kube-api-access-q95kh\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138554 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff54s\" (UniqueName: \"kubernetes.io/projected/7b2a3928-8ff2-43ed-90d0-159376b2c777-kube-api-access-ff54s\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138580 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-systemd\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.138614 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138605 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96de079e-3abf-48db-8ecf-bcd571c3ed27-serviceca\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138630 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-slash\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138656 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-etc-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-system-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-cnibin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138731 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a28734c-75dc-4444-ae1f-b70d31a241e2-konnectivity-ca\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de956b02-f02a-4203-a743-d9efee946739-tmp-dir\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138777 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-run\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138814 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-sys\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138842 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25jxx\" (UniqueName: \"kubernetes.io/projected/96de079e-3abf-48db-8ecf-bcd571c3ed27-kube-api-access-25jxx\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138870 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-var-lib-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138933 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-device-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138960 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-os-release\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139045 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jzns\" (UniqueName: \"kubernetes.io/projected/de956b02-f02a-4203-a743-d9efee946739-kube-api-access-6jzns\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139070 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-script-lib\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-hostroot\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139124 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139150 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-conf\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139197 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/96de079e-3abf-48db-8ecf-bcd571c3ed27-serviceca\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139267 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-conf\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-device-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.138248 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-registration-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139400 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-os-release\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139438 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139471 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-cnibin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-config\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139511 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-systemd\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dllgx\" (UniqueName: \"kubernetes.io/projected/f56b141f-364e-495d-9046-30f1c93dbc83-kube-api-access-dllgx\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139547 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-socket-dir-parent\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-systemd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139601 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-system-cni-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139620 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-system-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.139959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139641 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95"
Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139671 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-run\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139666 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-tuned\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-tmp\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2"
Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6"
Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.139800 2565 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-system-cni-dir\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140107 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-sys\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140184 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a28734c-75dc-4444-ae1f-b70d31a241e2-konnectivity-ca\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140233 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysctl-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140277 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-hostroot\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-socket-dir-parent\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140449 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-cni-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140455 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de956b02-f02a-4203-a743-d9efee946739-tmp-dir\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140498 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-conf-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140534 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-multus-conf-dir\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140538 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-modprobe-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysconfig\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140595 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-lib-modules\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.140731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140616 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-sysconfig\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-netns\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140651 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-netd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140678 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/de956b02-f02a-4203-a743-d9efee946739-hosts-file\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140702 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-lib-modules\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140709 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-var-lib-kubelet\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140739 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxzw\" (UniqueName: \"kubernetes.io/projected/1a521797-0035-4102-b6e7-e3757c2a296e-kube-api-access-pvxzw\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140745 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/de956b02-f02a-4203-a743-d9efee946739-hosts-file\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140738 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-cnibin\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-var-lib-kubelet\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140828 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-binary-copy\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: 
I0428 19:15:45.140857 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-netns\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140865 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f56b141f-364e-495d-9046-30f1c93dbc83-cnibin\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-multus\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140920 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-netns\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140925 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzjj\" (UniqueName: \"kubernetes.io/projected/d27f468f-a5ab-460e-8afc-5ff534c369dc-kube-api-access-7wzjj\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.142598 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:15:45.140953 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-systemd-units\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140957 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-multus\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.140985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-bin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141030 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-kubelet\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141051 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-multus-daemon-config\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141074 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-ovn\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141095 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-node-log\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141116 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-env-overrides\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141158 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-k8s-cni-cncf-io\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141204 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-k8s-cni-cncf-io\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141278 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-cni-bin\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141285 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-binary-copy\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141350 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-var-lib-kubelet\") pod \"multus-pps95\" (UID: 
\"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-multus-certs\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-socket-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143257 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-etc-selinux\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02955ab5-cc29-48ff-8727-30e4575778cb-iptables-alerter-script\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-cni-binary-copy\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141590 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-socket-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141605 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-kubelet-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141639 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/023534f3-e54d-45bb-b99b-12a35302ae01-host-run-multus-certs\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141639 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-kubernetes\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-host\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-kubernetes\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-multus-daemon-config\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141746 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-etc-selinux\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141779 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-host\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141836 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b2a3928-8ff2-43ed-90d0-159376b2c777-kubelet-dir\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141873 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a28734c-75dc-4444-ae1f-b70d31a241e2-agent-certs\") pod \"konnectivity-agent-dn4qh\" (UID: \"4a28734c-75dc-4444-ae1f-b70d31a241e2\") " pod="kube-system/konnectivity-agent-dn4qh" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.141845 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-modprobe-d\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.142203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/023534f3-e54d-45bb-b99b-12a35302ae01-cni-binary-copy\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.143789 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.142395 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-etc-tuned\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.144375 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.142657 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f56b141f-364e-495d-9046-30f1c93dbc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.144375 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.142966 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d27f468f-a5ab-460e-8afc-5ff534c369dc-tmp\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.157158 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.157131 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:45.157158 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.157156 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:45.157333 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.157169 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.157333 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.157234 
2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:15:45.65720877 +0000 UTC m=+3.056953831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.158662 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.158642 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jxx\" (UniqueName: \"kubernetes.io/projected/96de079e-3abf-48db-8ecf-bcd571c3ed27-kube-api-access-25jxx\") pod \"node-ca-clds8\" (UID: \"96de079e-3abf-48db-8ecf-bcd571c3ed27\") " pod="openshift-image-registry/node-ca-clds8" Apr 28 19:15:45.161562 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.161539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wzjj\" (UniqueName: \"kubernetes.io/projected/d27f468f-a5ab-460e-8afc-5ff534c369dc-kube-api-access-7wzjj\") pod \"tuned-2cht2\" (UID: \"d27f468f-a5ab-460e-8afc-5ff534c369dc\") " pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.161843 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.161809 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllgx\" (UniqueName: \"kubernetes.io/projected/f56b141f-364e-495d-9046-30f1c93dbc83-kube-api-access-dllgx\") pod \"multus-additional-cni-plugins-ls4b6\" (UID: \"f56b141f-364e-495d-9046-30f1c93dbc83\") " pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.162837 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.162816 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95kh\" (UniqueName: \"kubernetes.io/projected/023534f3-e54d-45bb-b99b-12a35302ae01-kube-api-access-q95kh\") pod \"multus-pps95\" (UID: \"023534f3-e54d-45bb-b99b-12a35302ae01\") " pod="openshift-multus/multus-pps95" Apr 28 19:15:45.170604 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.170584 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jzns\" (UniqueName: \"kubernetes.io/projected/de956b02-f02a-4203-a743-d9efee946739-kube-api-access-6jzns\") pod \"node-resolver-fqkh6\" (UID: \"de956b02-f02a-4203-a743-d9efee946739\") " pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.172367 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.172346 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff54s\" (UniqueName: \"kubernetes.io/projected/7b2a3928-8ff2-43ed-90d0-159376b2c777-kube-api-access-ff54s\") pod \"aws-ebs-csi-driver-node-h22m8\" (UID: \"7b2a3928-8ff2-43ed-90d0-159376b2c777\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.177114 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.177093 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:45.242934 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.242881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-var-lib-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.242934 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.242939 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.242966 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-script-lib\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.242992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.242995 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-var-lib-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-config\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243061 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-systemd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243087 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243099 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-netns\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243127 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-netd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-systemd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243158 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-pvxzw\" (UniqueName: \"kubernetes.io/projected/1a521797-0035-4102-b6e7-e3757c2a296e-kube-api-access-pvxzw\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-systemd-units\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.243235 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-ovn\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-netd\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-netns\") pod \"ovnkube-node-d6twb\" (UID: 
\"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-run-ovn\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.243307 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:15:45.743287022 +0000 UTC m=+3.143032101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-systemd-units\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-node-log\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:15:45.243376 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243402 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-env-overrides\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243408 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02955ab5-cc29-48ff-8727-30e4575778cb-iptables-alerter-script\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhjx\" (UniqueName: \"kubernetes.io/projected/02955ab5-cc29-48ff-8727-30e4575778cb-kube-api-access-qxhjx\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " 
pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243499 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvnf6\" (UniqueName: \"kubernetes.io/projected/09653a58-e44e-4fb5-a021-58bc08a4765f-kube-api-access-vvnf6\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243525 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-log-socket\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.243663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243530 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-node-log\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243536 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-config\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243549 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-bin\") pod \"ovnkube-node-d6twb\" (UID: 
\"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-cni-bin\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243602 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-log-socket\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243664 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-kubelet\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1a521797-0035-4102-b6e7-e3757c2a296e-ovn-node-metrics-cert\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243715 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02955ab5-cc29-48ff-8727-30e4575778cb-host-slash\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243743 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-slash\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-slash\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243798 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02955ab5-cc29-48ff-8727-30e4575778cb-host-slash\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-etc-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-host-kubelet\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.243881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a521797-0035-4102-b6e7-e3757c2a296e-etc-openvswitch\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.244032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/02955ab5-cc29-48ff-8727-30e4575778cb-iptables-alerter-script\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.244565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.244223 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-env-overrides\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.245126 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.244322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a521797-0035-4102-b6e7-e3757c2a296e-ovnkube-script-lib\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.246226 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.246203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a521797-0035-4102-b6e7-e3757c2a296e-ovn-node-metrics-cert\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.266173 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.266126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvnf6\" (UniqueName: \"kubernetes.io/projected/09653a58-e44e-4fb5-a021-58bc08a4765f-kube-api-access-vvnf6\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:45.272812 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.272784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhjx\" (UniqueName: \"kubernetes.io/projected/02955ab5-cc29-48ff-8727-30e4575778cb-kube-api-access-qxhjx\") pod \"iptables-alerter-tntl8\" (UID: \"02955ab5-cc29-48ff-8727-30e4575778cb\") " pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.275160 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.275138 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxzw\" (UniqueName: \"kubernetes.io/projected/1a521797-0035-4102-b6e7-e3757c2a296e-kube-api-access-pvxzw\") pod \"ovnkube-node-d6twb\" (UID: \"1a521797-0035-4102-b6e7-e3757c2a296e\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.333141 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.333116 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pps95" Apr 28 19:15:45.341776 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.341755 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" Apr 28 19:15:45.350246 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.350226 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2cht2" Apr 28 19:15:45.355853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.355832 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-clds8" Apr 28 19:15:45.363235 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.363218 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dn4qh" Apr 28 19:15:45.368848 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.368828 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fqkh6" Apr 28 19:15:45.376341 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.376324 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" Apr 28 19:15:45.382812 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.382797 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-tntl8" Apr 28 19:15:45.388369 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.388351 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" Apr 28 19:15:45.747514 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.747490 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:45.747615 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:45.747543 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:45.747690 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747630 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.747690 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747654 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:45.747690 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747674 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:45.747690 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747686 2565 projected.go:194] Error preparing 
data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.747845 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747690 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:15:46.747676358 +0000 UTC m=+4.147421413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:45.747845 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:45.747758 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:15:46.747732221 +0000 UTC m=+4.147477280 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:45.761250 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.761226 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27f468f_a5ab_460e_8afc_5ff534c369dc.slice/crio-a1cbd1c046e6daf075f3b4f42de86c8ac34206602e33bc021ab61a5391ea26b9 WatchSource:0}: Error finding container a1cbd1c046e6daf075f3b4f42de86c8ac34206602e33bc021ab61a5391ea26b9: Status 404 returned error can't find the container with id a1cbd1c046e6daf075f3b4f42de86c8ac34206602e33bc021ab61a5391ea26b9 Apr 28 19:15:45.765022 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.765000 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02955ab5_cc29_48ff_8727_30e4575778cb.slice/crio-2ea3a741a11dce9a28ca40d0d0b5e0d635362e13bc534fade8fd93d8da792030 WatchSource:0}: Error finding container 2ea3a741a11dce9a28ca40d0d0b5e0d635362e13bc534fade8fd93d8da792030: Status 404 returned error can't find the container with id 2ea3a741a11dce9a28ca40d0d0b5e0d635362e13bc534fade8fd93d8da792030 Apr 28 19:15:45.768867 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.768844 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a28734c_75dc_4444_ae1f_b70d31a241e2.slice/crio-bf55f7ab0032341e7e7f18eecbbcf1d255db42ac71fed6fa2360cbfc99c227e8 WatchSource:0}: Error finding container bf55f7ab0032341e7e7f18eecbbcf1d255db42ac71fed6fa2360cbfc99c227e8: Status 404 returned error can't find the 
container with id bf55f7ab0032341e7e7f18eecbbcf1d255db42ac71fed6fa2360cbfc99c227e8
Apr 28 19:15:45.769780 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.769759 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2a3928_8ff2_43ed_90d0_159376b2c777.slice/crio-beed7e4624bf091a09cfa5cd8f70ed3932313285d89ca27bb517e62647058e0e WatchSource:0}: Error finding container beed7e4624bf091a09cfa5cd8f70ed3932313285d89ca27bb517e62647058e0e: Status 404 returned error can't find the container with id beed7e4624bf091a09cfa5cd8f70ed3932313285d89ca27bb517e62647058e0e
Apr 28 19:15:45.770692 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.770493 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde956b02_f02a_4203_a743_d9efee946739.slice/crio-4ad1a09459d66da9022dbc3ccd0bf2729e1f31c4d75600a72e76b59bc52d8d93 WatchSource:0}: Error finding container 4ad1a09459d66da9022dbc3ccd0bf2729e1f31c4d75600a72e76b59bc52d8d93: Status 404 returned error can't find the container with id 4ad1a09459d66da9022dbc3ccd0bf2729e1f31c4d75600a72e76b59bc52d8d93
Apr 28 19:15:45.771368 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.771326 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56b141f_364e_495d_9046_30f1c93dbc83.slice/crio-58e6794dba60e07579ea4e60599c15febe6587db7815538c6c3dd2fd1e4f73a2 WatchSource:0}: Error finding container 58e6794dba60e07579ea4e60599c15febe6587db7815538c6c3dd2fd1e4f73a2: Status 404 returned error can't find the container with id 58e6794dba60e07579ea4e60599c15febe6587db7815538c6c3dd2fd1e4f73a2
Apr 28 19:15:45.773627 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.773569 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a521797_0035_4102_b6e7_e3757c2a296e.slice/crio-5156a22370d8ab4f2b810cf178fcef81966c2164f95166a6d19c89bce7d910a9 WatchSource:0}: Error finding container 5156a22370d8ab4f2b810cf178fcef81966c2164f95166a6d19c89bce7d910a9: Status 404 returned error can't find the container with id 5156a22370d8ab4f2b810cf178fcef81966c2164f95166a6d19c89bce7d910a9
Apr 28 19:15:45.774314 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.774291 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96de079e_3abf_48db_8ecf_bcd571c3ed27.slice/crio-631f04dd73ddd716fbf0834266e3854e94411458faf2cb1d54ac1c88ef07b499 WatchSource:0}: Error finding container 631f04dd73ddd716fbf0834266e3854e94411458faf2cb1d54ac1c88ef07b499: Status 404 returned error can't find the container with id 631f04dd73ddd716fbf0834266e3854e94411458faf2cb1d54ac1c88ef07b499
Apr 28 19:15:45.775994 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:15:45.775931 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod023534f3_e54d_45bb_b99b_12a35302ae01.slice/crio-f0e4edf45ea7add9f4d31ca147cb4326bc51996fede2287bae3c954ac4165a05 WatchSource:0}: Error finding container f0e4edf45ea7add9f4d31ca147cb4326bc51996fede2287bae3c954ac4165a05: Status 404 returned error can't find the container with id f0e4edf45ea7add9f4d31ca147cb4326bc51996fede2287bae3c954ac4165a05
Apr 28 19:15:46.045549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.045486 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ncgxg"]
Apr 28 19:15:46.047021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.047001 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.047125 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.047078 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:15:46.103500 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.103457 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:44 +0000 UTC" deadline="2027-12-17 07:22:32.935907457 +0000 UTC"
Apr 28 19:15:46.103500 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.103489 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14340h6m46.832421895s"
Apr 28 19:15:46.119778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.119754 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:46.119888 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.119866 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:15:46.128028 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.127048 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal" event={"ID":"9d1685eafe28745f79356d1935bdc8f9","Type":"ContainerStarted","Data":"f905f4526e89550938fbb31118cf12979b0c6ada3605a87f4ac74f0bf134630e"}
Apr 28 19:15:46.130444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.129016 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerStarted","Data":"58e6794dba60e07579ea4e60599c15febe6587db7815538c6c3dd2fd1e4f73a2"}
Apr 28 19:15:46.130444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.130198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fqkh6" event={"ID":"de956b02-f02a-4203-a743-d9efee946739","Type":"ContainerStarted","Data":"4ad1a09459d66da9022dbc3ccd0bf2729e1f31c4d75600a72e76b59bc52d8d93"}
Apr 28 19:15:46.132479 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.132435 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dn4qh" event={"ID":"4a28734c-75dc-4444-ae1f-b70d31a241e2","Type":"ContainerStarted","Data":"bf55f7ab0032341e7e7f18eecbbcf1d255db42ac71fed6fa2360cbfc99c227e8"}
Apr 28 19:15:46.133507 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.133480 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tntl8" event={"ID":"02955ab5-cc29-48ff-8727-30e4575778cb","Type":"ContainerStarted","Data":"2ea3a741a11dce9a28ca40d0d0b5e0d635362e13bc534fade8fd93d8da792030"}
Apr 28 19:15:46.134616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.134588 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2cht2" event={"ID":"d27f468f-a5ab-460e-8afc-5ff534c369dc","Type":"ContainerStarted","Data":"a1cbd1c046e6daf075f3b4f42de86c8ac34206602e33bc021ab61a5391ea26b9"}
Apr 28 19:15:46.136021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.135999 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-clds8" event={"ID":"96de079e-3abf-48db-8ecf-bcd571c3ed27","Type":"ContainerStarted","Data":"631f04dd73ddd716fbf0834266e3854e94411458faf2cb1d54ac1c88ef07b499"}
Apr 28 19:15:46.138736 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.138713 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"5156a22370d8ab4f2b810cf178fcef81966c2164f95166a6d19c89bce7d910a9"}
Apr 28 19:15:46.140666 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.140638 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pps95" event={"ID":"023534f3-e54d-45bb-b99b-12a35302ae01","Type":"ContainerStarted","Data":"f0e4edf45ea7add9f4d31ca147cb4326bc51996fede2287bae3c954ac4165a05"}
Apr 28 19:15:46.141945 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.141922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" event={"ID":"7b2a3928-8ff2-43ed-90d0-159376b2c777","Type":"ContainerStarted","Data":"beed7e4624bf091a09cfa5cd8f70ed3932313285d89ca27bb517e62647058e0e"}
Apr 28 19:15:46.150972 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.150947 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.151057 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.150998 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-kubelet-config\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.151057 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.151022 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-dbus\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.252135 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.252100 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-kubelet-config\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.252306 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.252148 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-dbus\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.252306 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.252239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.252418 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.252359 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:46.252469 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.252421 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:46.752402016 +0000 UTC m=+4.152147088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:46.252695 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.252671 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-kubelet-config\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.252777 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.252743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1059a8ad-7584-4de3-8259-c624717ec350-dbus\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.760201 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.760168 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:46.760355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.760219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:46.760355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:46.760266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:46.760464 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760379 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:46.760464 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760439 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:15:48.760420817 +0000 UTC m=+6.160165892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:46.760988 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760843 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:46.760988 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760861 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:46.760988 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760874 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:46.760988 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.760940 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:15:48.760925677 +0000 UTC m=+6.160670735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:46.761252 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.761001 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:46.761252 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:46.761033 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:47.761022746 +0000 UTC m=+5.160767802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:47.123655 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.122778 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:47.123655 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:47.122927 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:15:47.123655 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.123327 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:47.123655 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:47.123421 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:15:47.169607 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.169570 2565 generic.go:358] "Generic (PLEG): container finished" podID="8e7ef4154e4967cb8b4fb00da865259a" containerID="1f146300683a2cf2712df770ff8e3ae148b699c9497c0b4922c6089c984746fc" exitCode=0
Apr 28 19:15:47.169760 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.169691 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal" event={"ID":"8e7ef4154e4967cb8b4fb00da865259a","Type":"ContainerDied","Data":"1f146300683a2cf2712df770ff8e3ae148b699c9497c0b4922c6089c984746fc"}
Apr 28 19:15:47.196786 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.196739 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-119.ec2.internal" podStartSLOduration=3.196719705 podStartE2EDuration="3.196719705s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:15:46.143742645 +0000 UTC m=+3.543487724" watchObservedRunningTime="2026-04-28 19:15:47.196719705 +0000 UTC m=+4.596464784"
Apr 28 19:15:47.768811 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:47.768300 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:47.768811 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:47.768434 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:47.768811 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:47.768493 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:49.768475036 +0000 UTC m=+7.168220107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:48.119996 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:48.119926 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:48.120137 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.120048 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:15:48.178643 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:48.178467 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal" event={"ID":"8e7ef4154e4967cb8b4fb00da865259a","Type":"ContainerStarted","Data":"cce113db68fec4262c70236a362741a340a3d0e2aa337f4bc8e0da71de54869e"}
Apr 28 19:15:48.194287 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:48.194239 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-119.ec2.internal" podStartSLOduration=4.194221001 podStartE2EDuration="4.194221001s" podCreationTimestamp="2026-04-28 19:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:15:48.193583916 +0000 UTC m=+5.593328995" watchObservedRunningTime="2026-04-28 19:15:48.194221001 +0000 UTC m=+5.593966081"
Apr 28 19:15:48.777543 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:48.777509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:48.777674 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:48.777576 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:48.777770 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.777752 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:48.777836 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.777777 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:48.777836 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.777791 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:48.777957 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.777847 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:15:52.77782895 +0000 UTC m=+10.177574026 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:48.778040 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.778017 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:48.778122 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:48.778093 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:15:52.778076458 +0000 UTC m=+10.177821529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:49.120570 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:49.120496 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:49.120726 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:49.120628 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:15:49.121023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:49.120993 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:49.121139 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:49.121101 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:15:49.787553 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:49.786928 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:49.787553 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:49.787090 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:49.787553 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:49.787175 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:53.787155969 +0000 UTC m=+11.186901042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:50.119659 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:50.119575 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:50.119801 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:50.119737 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:15:51.120453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:51.120420 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:51.120453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:51.120466 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:51.121032 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:51.120558 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:15:51.121032 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:51.120599 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:15:52.120061 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:52.119874 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:52.120061 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.120011 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:15:52.808394 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:52.808353 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:52.808873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:52.808419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:15:52.808873 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.808583 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:52.808873 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.808601 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:52.808873 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.808613 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:52.808873 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.808668 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:16:00.808650069 +0000 UTC m=+18.208395128 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:52.809196 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.809071 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:52.809196 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:52.809122 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:00.809107186 +0000 UTC m=+18.208852246 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:53.122251 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:53.121399 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:15:53.122251 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:53.121698 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:15:53.122251 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:53.122077 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:53.122251 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:53.122170 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:15:53.816391 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:53.816355 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:15:53.816874 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:53.816503 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:53.816874 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:53.816561 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:01.816544082 +0000 UTC m=+19.216289143 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:15:54.120553 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:54.120450 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:54.120759 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:54.120579 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:15:55.120067 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:55.120037 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:15:55.120067 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:55.120050 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:55.120504 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:55.120153 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350" Apr 28 19:15:55.120504 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:55.120292 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:15:56.120551 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:56.120525 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:56.121006 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:56.120646 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:15:57.119602 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:57.119568 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:57.119602 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:57.119581 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:15:57.119831 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:57.119692 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:15:57.119885 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:57.119817 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350" Apr 28 19:15:58.120263 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:58.120221 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:15:58.120671 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:58.120331 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:15:59.120066 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:59.120029 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:15:59.120245 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:59.120164 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:15:59.120511 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:15:59.120487 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:15:59.120857 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:15:59.120574 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350" Apr 28 19:16:00.120294 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:00.120260 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:00.120514 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.120382 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:16:00.871087 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:00.871056 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:00.871103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871192 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871206 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871235 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871249 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871273 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.871252543 +0000 UTC m=+34.270997605 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:00.871468 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:00.871295 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:16:16.871283372 +0000 UTC m=+34.271028430 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:01.120046 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:01.120011 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:01.120216 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:01.120140 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:16:01.120282 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:01.120213 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:01.120325 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:01.120313 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350" Apr 28 19:16:01.878778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:01.878737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:01.879229 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:01.878891 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:01.879229 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:01.878989 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.878968815 +0000 UTC m=+35.278713872 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:02.119824 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:02.119791 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:02.119993 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:02.119926 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:16:03.120801 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:03.120768 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:03.121326 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:03.120852 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350" Apr 28 19:16:03.121326 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:03.120940 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:03.121326 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:03.121033 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:16:04.120523 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.120491 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:04.120659 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:04.120617 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec" Apr 28 19:16:04.205325 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.205297 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="8c16e22a67ca0b99e2a891b8bb3dd3c4ab68923eabb99aa75fcfeb382ab3789d" exitCode=0 Apr 28 19:16:04.206086 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.205368 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"8c16e22a67ca0b99e2a891b8bb3dd3c4ab68923eabb99aa75fcfeb382ab3789d"} Apr 28 19:16:04.207361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.207329 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fqkh6" event={"ID":"de956b02-f02a-4203-a743-d9efee946739","Type":"ContainerStarted","Data":"4072b5570e19dc45adbf1b4fe973dc441991ba4869b92ab7ae7b899344d534f0"} Apr 28 19:16:04.208906 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.208866 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dn4qh" 
event={"ID":"4a28734c-75dc-4444-ae1f-b70d31a241e2","Type":"ContainerStarted","Data":"145c9afe1b92c637c7930d003ad4c45399c9fc956d4979307a97e85e3c525e01"} Apr 28 19:16:04.210295 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.210269 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2cht2" event={"ID":"d27f468f-a5ab-460e-8afc-5ff534c369dc","Type":"ContainerStarted","Data":"9b8480dd0fa48b78ce86eef70cec08e00028840c732cb06e3fbe4617376c12e7"} Apr 28 19:16:04.211700 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.211678 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-clds8" event={"ID":"96de079e-3abf-48db-8ecf-bcd571c3ed27","Type":"ContainerStarted","Data":"524d912ab93e4fd218b513aeecf8dfdf0b8656be76a6816c4baa9ca01961f3d9"} Apr 28 19:16:04.214863 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214842 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"63784e03eb5c40dd35dd95c5d2993d4a4d6554db35c14f7285b89ffd42bda12c"} Apr 28 19:16:04.214950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214872 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"593885ae6ac15b6dad1271e8694d2af2fc2d94d06d9388f389d28b87e311b4b3"} Apr 28 19:16:04.214950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214887 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"75b3ae097e682803f6eac5a9fad2ab37edd879e351d79ce24891ac33ef6250cd"} Apr 28 19:16:04.214950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214921 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"87b249e3e4c206ea31fe907b2469030fe65018c0410d6566f4f52edb9d90b421"} Apr 28 19:16:04.214950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"152d49cb9a0b780558629e72f65fd33771a5d5072169245d3df52932596caba9"} Apr 28 19:16:04.214950 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.214949 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"3714d498dbd26174df4bda6b185c6e210ee73f03fa6b5f3959c1b0c73624069d"} Apr 28 19:16:04.216112 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.216091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pps95" event={"ID":"023534f3-e54d-45bb-b99b-12a35302ae01","Type":"ContainerStarted","Data":"3f35698103a3783511903eff404bb4b538255f9cf58eb6476a0b37fa33a5e720"} Apr 28 19:16:04.217316 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.217300 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" event={"ID":"7b2a3928-8ff2-43ed-90d0-159376b2c777","Type":"ContainerStarted","Data":"a44a215b91d15b8751a1fb753e9b0fa58894173aa4cac44d56fea4fee91d7e0c"} Apr 28 19:16:04.241201 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.241155 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2cht2" podStartSLOduration=3.82162693 podStartE2EDuration="21.241140625s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.762762112 +0000 UTC m=+3.162507171" 
lastFinishedPulling="2026-04-28 19:16:03.182275797 +0000 UTC m=+20.582020866" observedRunningTime="2026-04-28 19:16:04.240663972 +0000 UTC m=+21.640409052" watchObservedRunningTime="2026-04-28 19:16:04.241140625 +0000 UTC m=+21.640885703" Apr 28 19:16:04.253571 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.253529 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-clds8" podStartSLOduration=3.850179019 podStartE2EDuration="21.253495948s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.778814216 +0000 UTC m=+3.178559271" lastFinishedPulling="2026-04-28 19:16:03.182131142 +0000 UTC m=+20.581876200" observedRunningTime="2026-04-28 19:16:04.253010889 +0000 UTC m=+21.652755968" watchObservedRunningTime="2026-04-28 19:16:04.253495948 +0000 UTC m=+21.653241026" Apr 28 19:16:04.266853 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.266822 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dn4qh" podStartSLOduration=3.855530797 podStartE2EDuration="21.266812814s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.77084999 +0000 UTC m=+3.170595060" lastFinishedPulling="2026-04-28 19:16:03.182132007 +0000 UTC m=+20.581877077" observedRunningTime="2026-04-28 19:16:04.266588029 +0000 UTC m=+21.666333108" watchObservedRunningTime="2026-04-28 19:16:04.266812814 +0000 UTC m=+21.666557892" Apr 28 19:16:04.328073 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.327885 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pps95" podStartSLOduration=3.89356516 podStartE2EDuration="21.327871528s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.778827578 +0000 UTC m=+3.178572634" lastFinishedPulling="2026-04-28 19:16:03.213133946 +0000 UTC m=+20.612879002" 
observedRunningTime="2026-04-28 19:16:04.327690274 +0000 UTC m=+21.727435342" watchObservedRunningTime="2026-04-28 19:16:04.327871528 +0000 UTC m=+21.727616607" Apr 28 19:16:04.328207 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.328185 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fqkh6" podStartSLOduration=3.918743454 podStartE2EDuration="21.328176842s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.772684368 +0000 UTC m=+3.172429435" lastFinishedPulling="2026-04-28 19:16:03.182117759 +0000 UTC m=+20.581862823" observedRunningTime="2026-04-28 19:16:04.291793343 +0000 UTC m=+21.691538422" watchObservedRunningTime="2026-04-28 19:16:04.328176842 +0000 UTC m=+21.727921960" Apr 28 19:16:04.535203 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:04.535170 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:05.103740 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.103605 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:04.535190449Z","UUID":"22cef679-309c-48af-951b-3ecdd48ddb80","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:05.105600 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.105575 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:05.105600 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.105605 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:05.120324 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.120297 2565 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:05.120426 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.120327 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:05.120466 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:05.120439 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:16:05.120556 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:05.120540 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:05.221375 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.221341 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tntl8" event={"ID":"02955ab5-cc29-48ff-8727-30e4575778cb","Type":"ContainerStarted","Data":"00b63a74df97d927cc01f75bca5144e3fad3756be7405aa8fdd3277a21e8e454"}
Apr 28 19:16:05.223399 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.223371 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" event={"ID":"7b2a3928-8ff2-43ed-90d0-159376b2c777","Type":"ContainerStarted","Data":"dd130f19003c52ef69173f40d4eae8774e74a63cd83b3f60d24003d3da1f1581"}
Apr 28 19:16:05.322768 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.322736 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:16:05.323234 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.323214 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:16:05.342169 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:05.342120 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tntl8" podStartSLOduration=4.926994427 podStartE2EDuration="22.342103893s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.767151516 +0000 UTC m=+3.166896572" lastFinishedPulling="2026-04-28 19:16:03.182260978 +0000 UTC m=+20.582006038" observedRunningTime="2026-04-28 19:16:05.246716788 +0000 UTC m=+22.646461866" watchObservedRunningTime="2026-04-28 19:16:05.342103893 +0000 UTC m=+22.741848972"
Apr 28 19:16:06.120261 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.120186 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:06.120389 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:06.120301 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:06.228681 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.228640 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"e2ebc5600a728ace32d7981e2293d76389a7268bad69bd38628e9ceecfd33bce"}
Apr 28 19:16:06.230810 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.230782 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" event={"ID":"7b2a3928-8ff2-43ed-90d0-159376b2c777","Type":"ContainerStarted","Data":"ac4dc83c67e9dd3c51026bf433c7c0fac5d207bb8f4a2054ce40be263b04e95a"}
Apr 28 19:16:06.231345 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.231321 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:16:06.231528 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.231515 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dn4qh"
Apr 28 19:16:06.248561 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:06.248510 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-h22m8" podStartSLOduration=3.703955156 podStartE2EDuration="23.24849848s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.77200883 +0000 UTC m=+3.171753900" lastFinishedPulling="2026-04-28 19:16:05.316552151 +0000 UTC m=+22.716297224" observedRunningTime="2026-04-28 19:16:06.248278984 +0000 UTC m=+23.648024063" watchObservedRunningTime="2026-04-28 19:16:06.24849848 +0000 UTC m=+23.648243565"
Apr 28 19:16:07.120376 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:07.120341 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:07.120541 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:07.120352 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:07.120541 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:07.120476 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:07.120541 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:07.120530 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:08.120454 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.120290 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:08.120804 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:08.120557 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:08.237500 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.237443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" event={"ID":"1a521797-0035-4102-b6e7-e3757c2a296e","Type":"ContainerStarted","Data":"3a8335c6425c843dfbd6afe46519b10e2c111e0b6d520351d23c5cb88a439d93"}
Apr 28 19:16:08.237709 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.237679 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:08.237709 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.237704 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:08.251211 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.251186 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:08.251599 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.251585 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:08.267568 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:08.267530 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb" podStartSLOduration=7.735321961 podStartE2EDuration="25.267518985s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.776686138 +0000 UTC m=+3.176431199" lastFinishedPulling="2026-04-28 19:16:03.308883154 +0000 UTC m=+20.708628223" observedRunningTime="2026-04-28 19:16:08.267315741 +0000 UTC m=+25.667060818" watchObservedRunningTime="2026-04-28 19:16:08.267518985 +0000 UTC m=+25.667264062"
Apr 28 19:16:09.119654 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:09.119625 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:09.119927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:09.119625 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:09.119927 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:09.119717 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:09.119927 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:09.119776 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:09.240444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:09.240410 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="f79d69469674e5230e0337a43a5a4ec296836f07596f6db9a74e5f1c4a368620" exitCode=0
Apr 28 19:16:09.240955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:09.240465 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"f79d69469674e5230e0337a43a5a4ec296836f07596f6db9a74e5f1c4a368620"}
Apr 28 19:16:09.240955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:09.240576 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 28 19:16:10.120085 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.120059 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:10.120201 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:10.120180 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:10.244244 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.244218 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerStarted","Data":"e90300c98cbc3a634ec48a8d6bb937d520ba52ea883620d9d83826280b738373"}
Apr 28 19:16:10.244499 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.244308 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 28 19:16:10.297435 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.297406 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-88gvq"]
Apr 28 19:16:10.297554 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.297534 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:10.297644 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:10.297627 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:10.300917 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.300807 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ncgxg"]
Apr 28 19:16:10.301036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.300920 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:10.301036 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:10.301002 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:10.303849 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.303827 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8gbf7"]
Apr 28 19:16:10.303972 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:10.303921 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:10.304037 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:10.304006 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:11.247637 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:11.247474 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="e90300c98cbc3a634ec48a8d6bb937d520ba52ea883620d9d83826280b738373" exitCode=0
Apr 28 19:16:11.247637 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:11.247554 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"e90300c98cbc3a634ec48a8d6bb937d520ba52ea883620d9d83826280b738373"}
Apr 28 19:16:11.271861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:11.271836 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:12.119718 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:12.119691 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:12.119843 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:12.119749 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:12.119843 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:12.119775 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:12.119965 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:12.119886 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:12.119965 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:12.119927 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:12.119965 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:12.119959 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:12.251619 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:12.251594 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="64c16d038798930da45bc0385267c02d3f25e248e3132123bf210f318f5b0f20" exitCode=0
Apr 28 19:16:12.251929 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:12.251643 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"64c16d038798930da45bc0385267c02d3f25e248e3132123bf210f318f5b0f20"}
Apr 28 19:16:14.119658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:14.119628 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:14.119658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:14.119639 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:14.120353 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:14.119737 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:14.120353 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:14.119749 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:14.120353 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:14.119874 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:14.120353 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:14.119981 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:16.120584 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.120542 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:16.121051 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.120666 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:16.121051 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.120690 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:16.121051 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.120770 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gbf7" podUID="40262fc5-4234-4213-8739-f1ff807f34ec"
Apr 28 19:16:16.121051 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.120665 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ncgxg" podUID="1059a8ad-7584-4de3-8259-c624717ec350"
Apr 28 19:16:16.121051 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.120875 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f"
Apr 28 19:16:16.420003 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.419979 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-119.ec2.internal" event="NodeReady"
Apr 28 19:16:16.420163 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.420096 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:16:16.486467 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.486437 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"]
Apr 28 19:16:16.507101 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.507076 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.508783 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.508762 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmts"]
Apr 28 19:16:16.510055 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.510035 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 28 19:16:16.510402 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.510349 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lcrng\""
Apr 28 19:16:16.510651 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.510581 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 28 19:16:16.510651 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.510599 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 28 19:16:16.521332 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.521309 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 28 19:16:16.524678 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.524650 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"]
Apr 28 19:16:16.524751 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.524681 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmts"]
Apr 28 19:16:16.524803 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.524761 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:16.527060 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.527041 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:16.527180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.527156 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:16.527180 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.527174 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\""
Apr 28 19:16:16.527333 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.527242 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:16.603517 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.603471 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hwwt4"]
Apr 28 19:16:16.612444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.612416 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:16.615195 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.615167 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 28 19:16:16.615312 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.615180 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 28 19:16:16.615312 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.615224 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\""
Apr 28 19:16:16.618722 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.618703 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwwt4"]
Apr 28 19:16:16.695580 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695506 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695580 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695555 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695685 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695711 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695958 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695818 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dks7\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.695958 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695882 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.696028 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.695953 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/706dde5a-6656-4009-a585-2d9b3cbd4ecd-kube-api-access-cndtg\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:16.696028 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.696003 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:16.796887 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.796851 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2692402f-8242-404a-8d92-642c2dec47fb-config-volume\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:16.796887 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.796907 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:16.797100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/706dde5a-6656-4009-a585-2d9b3cbd4ecd-kube-api-access-cndtg\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:16.797100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797063 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:16.797100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797112 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2692402f-8242-404a-8d92-642c2dec47fb-tmp-dir\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797142 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9h8\" (UniqueName: \"kubernetes.io/projected/2692402f-8242-404a-8d92-642c2dec47fb-kube-api-access-nq9h8\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797176 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.797182 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797203 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797289 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.797286 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.297266544 +0000 UTC m=+34.697011600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.797316 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.797335 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.797395 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:17.297378433 +0000 UTC m=+34.697123507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797426 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797458 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797578 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797487 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dks7\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797843 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797591 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.797960 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.797915 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.798243 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.798212 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.802583 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.802553 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.802685 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.802581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:16.806427 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.806403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dks7\"
(UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:16:16.806620 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.806603 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:16:16.806826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.806810 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/706dde5a-6656-4009-a585-2d9b3cbd4ecd-kube-api-access-cndtg\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 19:16:16.898471 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898430 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:16.898639 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898539 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2692402f-8242-404a-8d92-642c2dec47fb-tmp-dir\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.898639 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898568 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9h8\" (UniqueName: \"kubernetes.io/projected/2692402f-8242-404a-8d92-642c2dec47fb-kube-api-access-nq9h8\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.898639 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898613 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:16.898639 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898639 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:16.898839 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898650 2565 projected.go:194] Error preparing data for projected volume kube-api-access-kdxgz for pod openshift-network-diagnostics/network-check-target-8gbf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.898839 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898699 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.898839 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898615 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:16.898839 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898734 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz podName:40262fc5-4234-4213-8739-f1ff807f34ec nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.898697808 +0000 UTC m=+66.298442865 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kdxgz" (UniqueName: "kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz") pod "network-check-target-8gbf7" (UID: "40262fc5-4234-4213-8739-f1ff807f34ec") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:16.898839 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898770 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.898750886 +0000 UTC m=+66.298495946 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:16.899113 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898842 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2692402f-8242-404a-8d92-642c2dec47fb-config-volume\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.899113 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.899113 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.898976 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2692402f-8242-404a-8d92-642c2dec47fb-tmp-dir\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.899113 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.898988 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:16.899113 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:16.899038 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:17.399024875 +0000 UTC m=+34.798769935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found Apr 28 19:16:16.899504 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.899481 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2692402f-8242-404a-8d92-642c2dec47fb-config-volume\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:16.907624 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:16.907605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9h8\" (UniqueName: \"kubernetes.io/projected/2692402f-8242-404a-8d92-642c2dec47fb-kube-api-access-nq9h8\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:17.302604 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:17.302567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 19:16:17.303136 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:17.302615 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:16:17.303136 ip-10-0-138-119 
kubenswrapper[2565]: E0428 19:16:17.302734 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:16:17.303136 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.302758 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found Apr 28 19:16:17.303136 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.302818 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:18.302798477 +0000 UTC m=+35.702543556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found Apr 28 19:16:17.303136 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.302733 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:17.303136 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.302883 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:18.302866583 +0000 UTC m=+35.702611642 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found Apr 28 19:16:17.403628 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:17.403594 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:17.403800 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.403738 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:17.403859 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.403806 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:18.403789371 +0000 UTC m=+35.803534436 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found Apr 28 19:16:17.907869 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:17.907825 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:17.908075 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.907996 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:17.908075 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:17.908062 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret podName:1059a8ad-7584-4de3-8259-c624717ec350 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:49.908047232 +0000 UTC m=+67.307792297 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret") pod "global-pull-secret-syncer-ncgxg" (UID: "1059a8ad-7584-4de3-8259-c624717ec350") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:18.120299 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.120267 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg" Apr 28 19:16:18.120495 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.120312 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7" Apr 28 19:16:18.120495 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.120314 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:16:18.123622 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.123497 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:16:18.123622 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.123514 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\"" Apr 28 19:16:18.123622 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.123544 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kx25j\"" Apr 28 19:16:18.124453 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.124429 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:16:18.124563 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.124481 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:16:18.124563 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.124437 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:16:18.310838 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.310808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 
19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.310857 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.310960 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.310974 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found Apr 28 19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.310981 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.311034 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:20.311014406 +0000 UTC m=+37.710759466 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found Apr 28 19:16:18.311249 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.311051 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:20.311042033 +0000 UTC m=+37.710787091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found Apr 28 19:16:18.411885 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:18.411845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:18.412075 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.411994 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:18.412075 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:18.412068 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:20.412047592 +0000 UTC m=+37.811792669 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found Apr 28 19:16:19.267021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:19.266828 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="8dec2888bed1712d1b72448efa1b490690ee6bc99626651fd64d440c64e11d92" exitCode=0 Apr 28 19:16:19.267021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:19.266912 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"8dec2888bed1712d1b72448efa1b490690ee6bc99626651fd64d440c64e11d92"} Apr 28 19:16:20.271127 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:20.271086 2565 generic.go:358] "Generic (PLEG): container finished" podID="f56b141f-364e-495d-9046-30f1c93dbc83" containerID="3ce33d49c195b7f95de22bd9caa1661efdbb2236db55e7a9931a5d118eee2917" exitCode=0 Apr 28 19:16:20.271489 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:20.271146 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerDied","Data":"3ce33d49c195b7f95de22bd9caa1661efdbb2236db55e7a9931a5d118eee2917"} Apr 28 19:16:20.326969 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:20.326937 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 19:16:20.327072 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:20.327002 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:16:20.327570 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.327554 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:20.327608 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.327604 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:24.327588711 +0000 UTC m=+41.727333767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found Apr 28 19:16:20.327957 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.327934 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:16:20.328021 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.327958 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found Apr 28 19:16:20.328021 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.328009 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:24.327992759 +0000 UTC m=+41.727737836 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found Apr 28 19:16:20.428255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:20.428226 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:16:20.428381 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.428361 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:20.428443 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:20.428432 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:24.428412562 +0000 UTC m=+41.828157618 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:21.275444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:21.275412 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" event={"ID":"f56b141f-364e-495d-9046-30f1c93dbc83","Type":"ContainerStarted","Data":"2886e50f6b78f6723136acc1882d43231afa577d932cc3499c1ab5226367257f"}
Apr 28 19:16:21.310335 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:21.310289 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ls4b6" podStartSLOduration=5.183239828 podStartE2EDuration="38.310277879s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:15:45.773757643 +0000 UTC m=+3.173502713" lastFinishedPulling="2026-04-28 19:16:18.900795706 +0000 UTC m=+36.300540764" observedRunningTime="2026-04-28 19:16:21.309540152 +0000 UTC m=+38.709285230" watchObservedRunningTime="2026-04-28 19:16:21.310277879 +0000 UTC m=+38.710022956"
Apr 28 19:16:24.354720 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:24.354689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:24.354728 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.354821 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.354831 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.354822 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.354871 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.35485909 +0000 UTC m=+49.754604167 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found
Apr 28 19:16:24.355150 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.354934 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.354917525 +0000 UTC m=+49.754662583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found
Apr 28 19:16:24.455495 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:24.455466 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:24.455634 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.455589 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:24.455634 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:24.455632 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.45562084 +0000 UTC m=+49.855365896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:32.407013 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:32.406980 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:32.407075 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.407151 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.407163 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.407174 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.407218 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.407202477 +0000 UTC m=+65.806947536 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found
Apr 28 19:16:32.407474 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.407232 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.407226007 +0000 UTC m=+65.806971063 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found
Apr 28 19:16:32.507555 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:32.507523 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:32.507704 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.507665 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:32.507750 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:32.507723 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:48.507709003 +0000 UTC m=+65.907454058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:41.281724 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:41.281697 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6twb"
Apr 28 19:16:48.417440 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.417406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:16:48.417440 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.417446 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:16:48.417846 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.417541 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:16:48.417846 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.417552 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found
Apr 28 19:16:48.417846 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.417561 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:48.417846 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.417606 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:20.417593325 +0000 UTC m=+97.817338381 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found
Apr 28 19:16:48.417846 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.417625 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:17:20.41761094 +0000 UTC m=+97.817355996 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found
Apr 28 19:16:48.518198 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.518164 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:16:48.518321 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.518287 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:48.518365 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.518336 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:17:20.518325026 +0000 UTC m=+97.918070081 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:48.920245 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.920215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:16:48.920396 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.920261 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:48.922745 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.922717 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:48.922932 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.922915 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:48.931407 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.931387 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:48.931490 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:16:48.931453 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:17:52.931436855 +0000 UTC m=+130.331181911 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : secret "metrics-daemon-secret" not found
Apr 28 19:16:48.933475 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.933462 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:48.945284 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:48.945267 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxgz\" (UniqueName: \"kubernetes.io/projected/40262fc5-4234-4213-8739-f1ff807f34ec-kube-api-access-kdxgz\") pod \"network-check-target-8gbf7\" (UID: \"40262fc5-4234-4213-8739-f1ff807f34ec\") " pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:49.047017 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.046993 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kx25j\""
Apr 28 19:16:49.055750 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.055734 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:49.210941 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.210870 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8gbf7"]
Apr 28 19:16:49.214524 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:16:49.214499 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40262fc5_4234_4213_8739_f1ff807f34ec.slice/crio-19df12f359fba443c209ecc0df32d8c1d72cf7a78076d12ea3283d85c6e9920c WatchSource:0}: Error finding container 19df12f359fba443c209ecc0df32d8c1d72cf7a78076d12ea3283d85c6e9920c: Status 404 returned error can't find the container with id 19df12f359fba443c209ecc0df32d8c1d72cf7a78076d12ea3283d85c6e9920c
Apr 28 19:16:49.324605 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.324576 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8gbf7" event={"ID":"40262fc5-4234-4213-8739-f1ff807f34ec","Type":"ContainerStarted","Data":"19df12f359fba443c209ecc0df32d8c1d72cf7a78076d12ea3283d85c6e9920c"}
Apr 28 19:16:49.929830 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.929787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:49.932359 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.932339 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:16:49.942746 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:49.942724 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1059a8ad-7584-4de3-8259-c624717ec350-original-pull-secret\") pod \"global-pull-secret-syncer-ncgxg\" (UID: \"1059a8ad-7584-4de3-8259-c624717ec350\") " pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:50.231925 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:50.231827 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ncgxg"
Apr 28 19:16:50.354653 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:50.354619 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ncgxg"]
Apr 28 19:16:50.358015 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:16:50.357990 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1059a8ad_7584_4de3_8259_c624717ec350.slice/crio-9fec1f9bb8ec4328b3f5b073becce278c90a8e2318656fdf320c9e5d75abf239 WatchSource:0}: Error finding container 9fec1f9bb8ec4328b3f5b073becce278c90a8e2318656fdf320c9e5d75abf239: Status 404 returned error can't find the container with id 9fec1f9bb8ec4328b3f5b073becce278c90a8e2318656fdf320c9e5d75abf239
Apr 28 19:16:51.330751 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:51.330717 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ncgxg" event={"ID":"1059a8ad-7584-4de3-8259-c624717ec350","Type":"ContainerStarted","Data":"9fec1f9bb8ec4328b3f5b073becce278c90a8e2318656fdf320c9e5d75abf239"}
Apr 28 19:16:53.335782 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:53.335749 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8gbf7" event={"ID":"40262fc5-4234-4213-8739-f1ff807f34ec","Type":"ContainerStarted","Data":"c80f887456bacee3850a53df1a5ea7ecc87a6a7d00dfadcaa6e2d939f3aa8ba8"}
Apr 28 19:16:53.336172 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:53.335886 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:16:53.356261 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:53.356220 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8gbf7" podStartSLOduration=67.274544149 podStartE2EDuration="1m10.356208112s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:16:49.216307006 +0000 UTC m=+66.616052062" lastFinishedPulling="2026-04-28 19:16:52.297970952 +0000 UTC m=+69.697716025" observedRunningTime="2026-04-28 19:16:53.355631142 +0000 UTC m=+70.755376220" watchObservedRunningTime="2026-04-28 19:16:53.356208112 +0000 UTC m=+70.755953190"
Apr 28 19:16:55.340556 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:55.340527 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ncgxg" event={"ID":"1059a8ad-7584-4de3-8259-c624717ec350","Type":"ContainerStarted","Data":"e700556b45ddd4fd90167a68d3396c4af4db9c278f952c5e2b1f71ddf239baec"}
Apr 28 19:16:55.359167 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:16:55.359129 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ncgxg" podStartSLOduration=64.75170712 podStartE2EDuration="1m9.359117412s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:16:50.359562119 +0000 UTC m=+67.759307175" lastFinishedPulling="2026-04-28 19:16:54.966972407 +0000 UTC m=+72.366717467" observedRunningTime="2026-04-28 19:16:55.358713044 +0000 UTC m=+72.758458119" watchObservedRunningTime="2026-04-28 19:16:55.359117412 +0000 UTC m=+72.758862487"
Apr 28 19:17:20.439608 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:17:20.439565 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:17:20.439622 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.439703 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.439741 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.439757 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c9bd88988-mljkv: secret "image-registry-tls" not found
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.439771 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert podName:706dde5a-6656-4009-a585-2d9b3cbd4ecd nodeName:}" failed. No retries permitted until 2026-04-28 19:18:24.439753858 +0000 UTC m=+161.839498916 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert") pod "ingress-canary-qfmts" (UID: "706dde5a-6656-4009-a585-2d9b3cbd4ecd") : secret "canary-serving-cert" not found
Apr 28 19:17:20.440041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.439809 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls podName:68018594-cd79-418d-92f7-ff2244ebff00 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:24.439794975 +0000 UTC m=+161.839540031 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls") pod "image-registry-7c9bd88988-mljkv" (UID: "68018594-cd79-418d-92f7-ff2244ebff00") : secret "image-registry-tls" not found
Apr 28 19:17:20.540147 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:17:20.540126 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:17:20.540272 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.540216 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:20.540272 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:20.540258 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls podName:2692402f-8242-404a-8d92-642c2dec47fb nodeName:}" failed. No retries permitted until 2026-04-28 19:18:24.540247183 +0000 UTC m=+161.939992239 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls") pod "dns-default-hwwt4" (UID: "2692402f-8242-404a-8d92-642c2dec47fb") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:24.339700 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:17:24.339665 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8gbf7"
Apr 28 19:17:52.945862 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:17:52.945809 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:17:52.946335 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:52.945956 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:17:52.946335 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:17:52.946033 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs podName:09653a58-e44e-4fb5-a021-58bc08a4765f nodeName:}" failed. No retries permitted until 2026-04-28 19:19:54.946010776 +0000 UTC m=+252.345755855 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs") pod "network-metrics-daemon-88gvq" (UID: "09653a58-e44e-4fb5-a021-58bc08a4765f") : secret "metrics-daemon-secret" not found
Apr 28 19:18:05.574159 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.574122 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"]
Apr 28 19:18:05.576973 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.576952 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"
Apr 28 19:18:05.579966 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.579949 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.580112 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.580092 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.580486 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.580468 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-v2fr4\""
Apr 28 19:18:05.585082 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.585061 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"]
Apr 28 19:18:05.682209 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.682184 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"]
Apr 28 19:18:05.684976 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.684961 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"
Apr 28 19:18:05.687487 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.687467 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 28 19:18:05.687953 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.687933 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zwvxl\""
Apr 28 19:18:05.688054 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.687955 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.688165 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.688150 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.689290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.689270 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x7t7n"]
Apr 28 19:18:05.691821 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.691803 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z897l"]
Apr 28 19:18:05.691952 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.691938 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-x7t7n"
Apr 28 19:18:05.693753 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.693735 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 28 19:18:05.694079 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.694061 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.694162 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.694119 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 28 19:18:05.694215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.694142 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.694215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.694195 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lqh8m\""
Apr 28 19:18:05.694470 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.694453 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-z897l"
Apr 28 19:18:05.696392 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.696372 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 28 19:18:05.696817 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.696793 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 28 19:18:05.697557 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.697539 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.697796 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.697775 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-xv8rd\""
Apr 28 19:18:05.697868 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.697834 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.704084 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.704066 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"]
Apr 28 19:18:05.705085 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.705063 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x7t7n"]
Apr 28 19:18:05.705398 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.705382 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 28 19:18:05.706015 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.705999 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 28 19:18:05.707862 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.707843 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z897l"]
Apr 28 19:18:05.730953 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.730929 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q6vs\" (UniqueName: \"kubernetes.io/projected/1c325e2f-148f-4151-92f9-55ef3817ae3b-kube-api-access-6q6vs\") pod \"volume-data-source-validator-7c6cbb6c87-w77g9\" (UID: \"1c325e2f-148f-4151-92f9-55ef3817ae3b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"
Apr 28 19:18:05.780823 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.780799 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"]
Apr 28 19:18:05.783870 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.783855 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"
Apr 28 19:18:05.786030 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.786014 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.786134 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.786081 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-htk7t\""
Apr 28 19:18:05.786190 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.786132 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.786278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.786260 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 28 19:18:05.786331 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.786263 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 28 19:18:05.798386 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.798370 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"]
Apr 28 19:18:05.800997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.800979 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:05.804340 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.804321 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 28 19:18:05.804455 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.804347 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.804519 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.804453 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.804519 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.804494 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 28 19:18:05.804648 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.804626 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-pdxn8\""
Apr 28 19:18:05.805332 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.805311 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"]
Apr 28 19:18:05.826229 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.826167 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"]
Apr 28 19:18:05.831421 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-config\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l"
Apr 28 19:18:05.831529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831452 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-trusted-ca\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l"
Apr 28 19:18:05.831529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q6vs\" (UniqueName: \"kubernetes.io/projected/1c325e2f-148f-4151-92f9-55ef3817ae3b-kube-api-access-6q6vs\") pod \"volume-data-source-validator-7c6cbb6c87-w77g9\" (UID: \"1c325e2f-148f-4151-92f9-55ef3817ae3b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"
Apr 28 19:18:05.831626 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831578 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-service-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n"
Apr 28 19:18:05.831626 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n"
Apr 28 19:18:05.831724 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831654 2565
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zp58\" (UniqueName: \"kubernetes.io/projected/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-kube-api-access-4zp58\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.831724 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831683 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4qcx\" (UniqueName: \"kubernetes.io/projected/23dbedd7-660b-4266-ba11-0c1f93700217-kube-api-access-r4qcx\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:05.831813 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831727 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:05.831813 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831772 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-serving-cert\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.831927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831847 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fch5v\" (UniqueName: \"kubernetes.io/projected/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-kube-api-access-fch5v\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.831927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831880 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-serving-cert\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.831927 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831915 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-snapshots\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.832063 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.831970 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-tmp\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.839575 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.839556 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q6vs\" (UniqueName: \"kubernetes.io/projected/1c325e2f-148f-4151-92f9-55ef3817ae3b-kube-api-access-6q6vs\") pod \"volume-data-source-validator-7c6cbb6c87-w77g9\" (UID: \"1c325e2f-148f-4151-92f9-55ef3817ae3b\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9" Apr 28 19:18:05.879333 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.879312 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wl476"] Apr 28 19:18:05.882179 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.882163 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" Apr 28 19:18:05.884337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.884319 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 28 19:18:05.884337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.884336 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 28 19:18:05.884475 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.884341 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-knsnz\"" Apr 28 19:18:05.885783 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.885769 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9" Apr 28 19:18:05.895481 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.895461 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wl476"] Apr 28 19:18:05.932399 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-service-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.932549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932411 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.932549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932440 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zp58\" (UniqueName: \"kubernetes.io/projected/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-kube-api-access-4zp58\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.932549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932469 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4qcx\" (UniqueName: \"kubernetes.io/projected/23dbedd7-660b-4266-ba11-0c1f93700217-kube-api-access-r4qcx\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: 
\"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:05.932549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:05.932549 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932530 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-serving-cert\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932562 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bf92c20-e551-497f-995e-ea716db91e5d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 
28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932719 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fch5v\" (UniqueName: \"kubernetes.io/projected/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-kube-api-access-fch5v\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932754 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpdz\" (UniqueName: \"kubernetes.io/projected/32912883-a580-4985-b771-5a3b8b2d6ab5-kube-api-access-czpdz\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-serving-cert\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.932858 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932843 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-snapshots\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932877 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-tmp\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932922 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32912883-a580-4985-b771-5a3b8b2d6ab5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932940 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-config\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932968 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32912883-a580-4985-b771-5a3b8b2d6ab5-config\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.932987 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj8hs\" (UniqueName: \"kubernetes.io/projected/8bf92c20-e551-497f-995e-ea716db91e5d-kube-api-access-fj8hs\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.933007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-trusted-ca\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.933266 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.933100 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-service-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.934023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.933382 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-tmp\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.934186 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.934085 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-trusted-ca\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.935367 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.934969 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.935367 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.935003 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-config\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.935367 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.935301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-snapshots\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.939297 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:05.937835 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 28 19:18:05.939297 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:05.937967 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls podName:23dbedd7-660b-4266-ba11-0c1f93700217 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.437947689 +0000 UTC m=+143.837692755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbn4f" (UID: "23dbedd7-660b-4266-ba11-0c1f93700217") : secret "samples-operator-tls" not found Apr 28 19:18:05.939297 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.939265 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-serving-cert\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.945738 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.945689 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fch5v\" (UniqueName: \"kubernetes.io/projected/3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787-kube-api-access-fch5v\") pod \"insights-operator-585dfdc468-x7t7n\" (UID: \"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787\") " pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:05.946461 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.946409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4qcx\" (UniqueName: \"kubernetes.io/projected/23dbedd7-660b-4266-ba11-0c1f93700217-kube-api-access-r4qcx\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:05.947445 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.947418 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-serving-cert\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:05.947527 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:05.947453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zp58\" (UniqueName: \"kubernetes.io/projected/58244bb4-99cf-41d7-91d2-e3c4ffe45e20-kube-api-access-4zp58\") pod \"console-operator-9d4b6777b-z897l\" (UID: \"58244bb4-99cf-41d7-91d2-e3c4ffe45e20\") " pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:06.004689 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.004658 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" Apr 28 19:18:06.005037 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.005009 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9"] Apr 28 19:18:06.009397 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:06.009372 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c325e2f_148f_4151_92f9_55ef3817ae3b.slice/crio-ef45fbaaecad06d2deb270ce9c325af44b22ff819bf004105f0fb3ff2e3bd7d5 WatchSource:0}: Error finding container ef45fbaaecad06d2deb270ce9c325af44b22ff819bf004105f0fb3ff2e3bd7d5: Status 404 returned error can't find the container with id ef45fbaaecad06d2deb270ce9c325af44b22ff819bf004105f0fb3ff2e3bd7d5 Apr 28 19:18:06.012605 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.012310 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:06.038214 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bf92c20-e551-497f-995e-ea716db91e5d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:06.038361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:06.038361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czpdz\" (UniqueName: \"kubernetes.io/projected/32912883-a580-4985-b771-5a3b8b2d6ab5-kube-api-access-czpdz\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:06.038361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038294 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32912883-a580-4985-b771-5a3b8b2d6ab5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:06.038361 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:18:06.038321 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32912883-a580-4985-b771-5a3b8b2d6ab5-config\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:06.038361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj8hs\" (UniqueName: \"kubernetes.io/projected/8bf92c20-e551-497f-995e-ea716db91e5d-kube-api-access-fj8hs\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:06.038637 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038384 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d16fc61d-1bc8-4386-8d52-c516641eb2f9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" Apr 28 19:18:06.038637 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.038407 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" Apr 28 19:18:06.038637 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.038378 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 28 19:18:06.038818 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.038664 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.538619695 +0000 UTC m=+143.938364751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:06.039069 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.039046 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bf92c20-e551-497f-995e-ea716db91e5d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:06.039129 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.039091 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32912883-a580-4985-b771-5a3b8b2d6ab5-config\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" Apr 28 19:18:06.040693 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.040624 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32912883-a580-4985-b771-5a3b8b2d6ab5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: 
\"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"
Apr 28 19:18:06.047055 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.047036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj8hs\" (UniqueName: \"kubernetes.io/projected/8bf92c20-e551-497f-995e-ea716db91e5d-kube-api-access-fj8hs\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:06.047561 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.047543 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpdz\" (UniqueName: \"kubernetes.io/projected/32912883-a580-4985-b771-5a3b8b2d6ab5-kube-api-access-czpdz\") pod \"service-ca-operator-d6fc45fc5-dzfv2\" (UID: \"32912883-a580-4985-b771-5a3b8b2d6ab5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"
Apr 28 19:18:06.092301 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.092278 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"
Apr 28 19:18:06.129554 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.129526 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x7t7n"]
Apr 28 19:18:06.133261 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:06.133232 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a57a1a7_9bff_45c6_a3ab_4fdacd5a6787.slice/crio-e5d92d67f482cc2b3e005fd323cca4d252e8ae178b45c897d3b5f0ced0d3ce40 WatchSource:0}: Error finding container e5d92d67f482cc2b3e005fd323cca4d252e8ae178b45c897d3b5f0ced0d3ce40: Status 404 returned error can't find the container with id e5d92d67f482cc2b3e005fd323cca4d252e8ae178b45c897d3b5f0ced0d3ce40
Apr 28 19:18:06.138859 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.138836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d16fc61d-1bc8-4386-8d52-c516641eb2f9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:06.138972 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.138871 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:06.139077 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.139056 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:18:06.139145 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.139134 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.639112624 +0000 UTC m=+144.038857684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found
Apr 28 19:18:06.139460 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.139438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d16fc61d-1bc8-4386-8d52-c516641eb2f9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:06.145545 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.145440 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z897l"]
Apr 28 19:18:06.149433 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:06.149375 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58244bb4_99cf_41d7_91d2_e3c4ffe45e20.slice/crio-c5b2a85d5fdd48c3023bf47020a79e2193ef26f61f58b41affc648e3136435f1 WatchSource:0}: Error finding container c5b2a85d5fdd48c3023bf47020a79e2193ef26f61f58b41affc648e3136435f1: Status 404 returned error can't find the container with id c5b2a85d5fdd48c3023bf47020a79e2193ef26f61f58b41affc648e3136435f1
Apr 28 19:18:06.212072 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.212044 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2"]
Apr 28 19:18:06.214648 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:06.214622 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32912883_a580_4985_b771_5a3b8b2d6ab5.slice/crio-8ef7b5d68c27277a125690824fc2cc065df3f4737907ae55bb75b0e4cb177004 WatchSource:0}: Error finding container 8ef7b5d68c27277a125690824fc2cc065df3f4737907ae55bb75b0e4cb177004: Status 404 returned error can't find the container with id 8ef7b5d68c27277a125690824fc2cc065df3f4737907ae55bb75b0e4cb177004
Apr 28 19:18:06.441099 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.441028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"
Apr 28 19:18:06.441215 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.441174 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:18:06.441253 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.441230 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls podName:23dbedd7-660b-4266-ba11-0c1f93700217 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.441215717 +0000 UTC m=+144.840960779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbn4f" (UID: "23dbedd7-660b-4266-ba11-0c1f93700217") : secret "samples-operator-tls" not found
Apr 28 19:18:06.468661 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.468632 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" event={"ID":"58244bb4-99cf-41d7-91d2-e3c4ffe45e20","Type":"ContainerStarted","Data":"c5b2a85d5fdd48c3023bf47020a79e2193ef26f61f58b41affc648e3136435f1"}
Apr 28 19:18:06.469623 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.469602 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" event={"ID":"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787","Type":"ContainerStarted","Data":"e5d92d67f482cc2b3e005fd323cca4d252e8ae178b45c897d3b5f0ced0d3ce40"}
Apr 28 19:18:06.470500 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.470473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" event={"ID":"32912883-a580-4985-b771-5a3b8b2d6ab5","Type":"ContainerStarted","Data":"8ef7b5d68c27277a125690824fc2cc065df3f4737907ae55bb75b0e4cb177004"}
Apr 28 19:18:06.471361 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.471338 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9" event={"ID":"1c325e2f-148f-4151-92f9-55ef3817ae3b","Type":"ContainerStarted","Data":"ef45fbaaecad06d2deb270ce9c325af44b22ff819bf004105f0fb3ff2e3bd7d5"}
Apr 28 19:18:06.542063 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.542040 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:06.542231 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.542210 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:06.542326 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.542277 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.542263048 +0000 UTC m=+144.942008105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:06.643444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:06.643413 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:06.643807 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.643579 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:18:06.643807 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:06.643648 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.643632144 +0000 UTC m=+145.043377199 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found
Apr 28 19:18:07.449565 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:07.449523 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"
Apr 28 19:18:07.449749 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.449673 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:18:07.449749 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.449739 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls podName:23dbedd7-660b-4266-ba11-0c1f93700217 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.449719241 +0000 UTC m=+146.849464304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbn4f" (UID: "23dbedd7-660b-4266-ba11-0c1f93700217") : secret "samples-operator-tls" not found
Apr 28 19:18:07.550993 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:07.550961 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:07.551180 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.551138 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:07.551346 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.551209 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.551189289 +0000 UTC m=+146.950934346 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:07.652418 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:07.652311 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:07.652870 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.652529 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:18:07.652870 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:07.652595 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.652575099 +0000 UTC m=+147.052320181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found
Apr 28 19:18:08.477758 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:08.477710 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9" event={"ID":"1c325e2f-148f-4151-92f9-55ef3817ae3b","Type":"ContainerStarted","Data":"596a19f2701bacb566b7ebe6298306a93aec273a2aa0f729256b741bc740a235"}
Apr 28 19:18:08.493239 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:08.493155 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-w77g9" podStartSLOduration=1.99225914 podStartE2EDuration="3.493141481s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:06.012711223 +0000 UTC m=+143.412456280" lastFinishedPulling="2026-04-28 19:18:07.513593551 +0000 UTC m=+144.913338621" observedRunningTime="2026-04-28 19:18:08.493072094 +0000 UTC m=+145.892817173" watchObservedRunningTime="2026-04-28 19:18:08.493141481 +0000 UTC m=+145.892886560"
Apr 28 19:18:09.469576 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.465921 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"
Apr 28 19:18:09.469576 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.466088 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:18:09.469576 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.466161 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls podName:23dbedd7-660b-4266-ba11-0c1f93700217 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.466142607 +0000 UTC m=+150.865887682 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbn4f" (UID: "23dbedd7-660b-4266-ba11-0c1f93700217") : secret "samples-operator-tls" not found
Apr 28 19:18:09.481215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.481193 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/0.log"
Apr 28 19:18:09.481337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.481229 2565 generic.go:358] "Generic (PLEG): container finished" podID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20" containerID="cb25247dc2b54691b6a1ecd2dc3047e85caa4968614ec29ccbee980afe7f6a4c" exitCode=255
Apr 28 19:18:09.481337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.481288 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" event={"ID":"58244bb4-99cf-41d7-91d2-e3c4ffe45e20","Type":"ContainerDied","Data":"cb25247dc2b54691b6a1ecd2dc3047e85caa4968614ec29ccbee980afe7f6a4c"}
Apr 28 19:18:09.481559 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.481530 2565 scope.go:117] "RemoveContainer" containerID="cb25247dc2b54691b6a1ecd2dc3047e85caa4968614ec29ccbee980afe7f6a4c"
Apr 28 19:18:09.482773 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.482738 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" event={"ID":"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787","Type":"ContainerStarted","Data":"93d25ebbb0ec8f0ba5874f4d26430fa661c777f949afc903fe619b03d684d52c"}
Apr 28 19:18:09.484068 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.484037 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" event={"ID":"32912883-a580-4985-b771-5a3b8b2d6ab5","Type":"ContainerStarted","Data":"caf7ec225964dab57c8ae3ab5e05e1965e6bab0410cf2ff13de70b90b4e17797"}
Apr 28 19:18:09.516582 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.516543 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" podStartSLOduration=1.424189725 podStartE2EDuration="4.516531301s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:06.135215738 +0000 UTC m=+143.534960808" lastFinishedPulling="2026-04-28 19:18:09.227557328 +0000 UTC m=+146.627302384" observedRunningTime="2026-04-28 19:18:09.515641353 +0000 UTC m=+146.915386434" watchObservedRunningTime="2026-04-28 19:18:09.516531301 +0000 UTC m=+146.916276378"
Apr 28 19:18:09.531713 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.531674 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" podStartSLOduration=1.522315748 podStartE2EDuration="4.531661343s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:06.216356546 +0000 UTC m=+143.616101603" lastFinishedPulling="2026-04-28 19:18:09.225702142 +0000 UTC m=+146.625447198" observedRunningTime="2026-04-28 19:18:09.530570268 +0000 UTC m=+146.930315358" watchObservedRunningTime="2026-04-28 19:18:09.531661343 +0000 UTC m=+146.931406420"
Apr 28 19:18:09.567309 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.567133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:09.567309 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.567252 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:09.567452 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.567315 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.567295559 +0000 UTC m=+150.967040623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:09.668346 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:09.668158 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:09.668485 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.668456 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 28 19:18:09.668544 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:09.668518 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.668498414 +0000 UTC m=+151.068243492 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found
Apr 28 19:18:10.487600 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.487573 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/1.log"
Apr 28 19:18:10.487992 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.487951 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/0.log"
Apr 28 19:18:10.487992 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.487983 2565 generic.go:358] "Generic (PLEG): container finished" podID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab" exitCode=255
Apr 28 19:18:10.488108 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.488012 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" event={"ID":"58244bb4-99cf-41d7-91d2-e3c4ffe45e20","Type":"ContainerDied","Data":"76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab"}
Apr 28 19:18:10.488108 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.488051 2565 scope.go:117] "RemoveContainer" containerID="cb25247dc2b54691b6a1ecd2dc3047e85caa4968614ec29ccbee980afe7f6a4c"
Apr 28 19:18:10.488333 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:10.488314 2565 scope.go:117] "RemoveContainer" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab"
Apr 28 19:18:10.488526 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:10.488507 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20"
Apr 28 19:18:11.495044 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.495015 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/1.log"
Apr 28 19:18:11.495433 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.495318 2565 scope.go:117] "RemoveContainer" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab"
Apr 28 19:18:11.495504 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:11.495487 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20"
Apr 28 19:18:11.701185 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.701152 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"]
Apr 28 19:18:11.705312 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.705295 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"
Apr 28 19:18:11.708106 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.708086 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-j922v\""
Apr 28 19:18:11.708501 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.708474 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 28 19:18:11.709171 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.709150 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 28 19:18:11.715082 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.715062 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"]
Apr 28 19:18:11.887557 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.887529 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rk7\" (UniqueName: \"kubernetes.io/projected/0bf50d78-7e47-47df-b7c8-dfec55922b19-kube-api-access-z2rk7\") pod \"migrator-74bb7799d9-m69vl\" (UID: \"0bf50d78-7e47-47df-b7c8-dfec55922b19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"
Apr 28 19:18:11.988833 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.988802 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rk7\" (UniqueName: \"kubernetes.io/projected/0bf50d78-7e47-47df-b7c8-dfec55922b19-kube-api-access-z2rk7\") pod \"migrator-74bb7799d9-m69vl\" (UID: \"0bf50d78-7e47-47df-b7c8-dfec55922b19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"
Apr 28 19:18:11.996917 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:11.996873 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rk7\" (UniqueName: \"kubernetes.io/projected/0bf50d78-7e47-47df-b7c8-dfec55922b19-kube-api-access-z2rk7\") pod \"migrator-74bb7799d9-m69vl\" (UID: \"0bf50d78-7e47-47df-b7c8-dfec55922b19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"
Apr 28 19:18:12.014924 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.014890 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"
Apr 28 19:18:12.124408 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.124381 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl"]
Apr 28 19:18:12.128108 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:12.128085 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf50d78_7e47_47df_b7c8_dfec55922b19.slice/crio-8d9ecb9232358226abd89659c1ff8e6eaed9397be4e734b3bad4af9e6c58fc2b WatchSource:0}: Error finding container 8d9ecb9232358226abd89659c1ff8e6eaed9397be4e734b3bad4af9e6c58fc2b: Status 404 returned error can't find the container with id 8d9ecb9232358226abd89659c1ff8e6eaed9397be4e734b3bad4af9e6c58fc2b
Apr 28 19:18:12.498802 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.498771 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl" event={"ID":"0bf50d78-7e47-47df-b7c8-dfec55922b19","Type":"ContainerStarted","Data":"8d9ecb9232358226abd89659c1ff8e6eaed9397be4e734b3bad4af9e6c58fc2b"}
Apr 28 19:18:12.657223 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.657193 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z9h7k"]
Apr 28 19:18:12.660160 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.660135 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.662212 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.662192 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-47c6s\""
Apr 28 19:18:12.662371 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.662263 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 28 19:18:12.662955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.662929 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 28 19:18:12.663059 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.662933 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 28 19:18:12.663059 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.662933 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 28 19:18:12.670104 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.670084 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z9h7k"]
Apr 28 19:18:12.795528 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.795501 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-key\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.795680 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.795620 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-cabundle\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.795733 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.795685 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q84v\" (UniqueName: \"kubernetes.io/projected/7ae6723d-1fbf-4777-ad17-79331245fbb1-kube-api-access-8q84v\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.896053 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.896019 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q84v\" (UniqueName: \"kubernetes.io/projected/7ae6723d-1fbf-4777-ad17-79331245fbb1-kube-api-access-8q84v\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.896213 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.896066 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-key\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.896384 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.896356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-cabundle\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.897650 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.897625 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-cabundle\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.898761 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.898738 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7ae6723d-1fbf-4777-ad17-79331245fbb1-signing-key\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.904824 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.904805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q84v\" (UniqueName: \"kubernetes.io/projected/7ae6723d-1fbf-4777-ad17-79331245fbb1-kube-api-access-8q84v\") pod \"service-ca-865cb79987-z9h7k\" (UID: \"7ae6723d-1fbf-4777-ad17-79331245fbb1\") " pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.971280 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.971253 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-z9h7k"
Apr 28 19:18:12.984376 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:12.984357 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fqkh6_de956b02-f02a-4203-a743-d9efee946739/dns-node-resolver/0.log"
Apr 28 19:18:13.312446 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.312244 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-z9h7k"]
Apr 28 19:18:13.347102 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:13.347060 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae6723d_1fbf_4777_ad17_79331245fbb1.slice/crio-55053d05cb5cf38773049a36ef121bbfa9db430ccffcaeb35752395fbf519c1b WatchSource:0}: Error finding container 55053d05cb5cf38773049a36ef121bbfa9db430ccffcaeb35752395fbf519c1b: Status 404 returned error can't find the container with id 55053d05cb5cf38773049a36ef121bbfa9db430ccffcaeb35752395fbf519c1b
Apr 28 19:18:13.500835 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.500759 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"
Apr 28 19:18:13.501260 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.500892 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 28 19:18:13.501260 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.500992 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls
podName:23dbedd7-660b-4266-ba11-0c1f93700217 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.500971076 +0000 UTC m=+158.900716132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jbn4f" (UID: "23dbedd7-660b-4266-ba11-0c1f93700217") : secret "samples-operator-tls" not found Apr 28 19:18:13.502735 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.502705 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-z9h7k" event={"ID":"7ae6723d-1fbf-4777-ad17-79331245fbb1","Type":"ContainerStarted","Data":"bb59ff156c13235423952e220bc8e95053edaa39fd1de06c93fa7dbacde23f28"} Apr 28 19:18:13.502861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.502743 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-z9h7k" event={"ID":"7ae6723d-1fbf-4777-ad17-79331245fbb1","Type":"ContainerStarted","Data":"55053d05cb5cf38773049a36ef121bbfa9db430ccffcaeb35752395fbf519c1b"} Apr 28 19:18:13.504234 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.504209 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl" event={"ID":"0bf50d78-7e47-47df-b7c8-dfec55922b19","Type":"ContainerStarted","Data":"5f786d55c7dfad59426148d89611fe0474f63617b34f83faaf567e09b4541ac7"} Apr 28 19:18:13.504320 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.504239 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl" event={"ID":"0bf50d78-7e47-47df-b7c8-dfec55922b19","Type":"ContainerStarted","Data":"44c0a9769cfadbe72895142715dfa7fffb1c9ad6cbaa9c612204e671eafd4993"} Apr 28 19:18:13.523461 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.523425 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-z9h7k" podStartSLOduration=1.5234144189999999 podStartE2EDuration="1.523414419s" podCreationTimestamp="2026-04-28 19:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:13.523189968 +0000 UTC m=+150.922935046" watchObservedRunningTime="2026-04-28 19:18:13.523414419 +0000 UTC m=+150.923159496" Apr 28 19:18:13.542856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.542818 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-m69vl" podStartSLOduration=1.432471147 podStartE2EDuration="2.542807503s" podCreationTimestamp="2026-04-28 19:18:11 +0000 UTC" firstStartedPulling="2026-04-28 19:18:12.129978098 +0000 UTC m=+149.529723154" lastFinishedPulling="2026-04-28 19:18:13.240314444 +0000 UTC m=+150.640059510" observedRunningTime="2026-04-28 19:18:13.541633047 +0000 UTC m=+150.941378125" watchObservedRunningTime="2026-04-28 19:18:13.542807503 +0000 UTC m=+150.942552581" Apr 28 19:18:13.601474 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.601451 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:13.601701 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.601674 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:13.601799 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.601747 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.601728373 +0000 UTC m=+159.001473432 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:13.702061 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.702036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" Apr 28 19:18:13.702147 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.702132 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:18:13.702201 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:13.702175 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.702163663 +0000 UTC m=+159.101908720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found Apr 28 19:18:13.785444 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:13.785419 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-clds8_96de079e-3abf-48db-8ecf-bcd571c3ed27/node-ca/0.log" Apr 28 19:18:16.013419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:16.013383 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:16.013419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:16.013418 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:16.013848 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:16.013771 2565 scope.go:117] "RemoveContainer" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab" Apr 28 19:18:16.013972 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:16.013954 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20" Apr 28 19:18:19.518980 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:19.518933 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" podUID="68018594-cd79-418d-92f7-ff2244ebff00" Apr 28 19:18:19.535434 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:19.535401 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qfmts" podUID="706dde5a-6656-4009-a585-2d9b3cbd4ecd" Apr 28 19:18:19.623067 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:19.623031 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hwwt4" podUID="2692402f-8242-404a-8d92-642c2dec47fb" Apr 28 19:18:20.522643 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:20.522610 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwwt4" Apr 28 19:18:20.523102 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:20.522609 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:18:21.138765 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:21.138721 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-88gvq" podUID="09653a58-e44e-4fb5-a021-58bc08a4765f" Apr 28 19:18:21.565945 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.565910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:21.568152 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.568126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/23dbedd7-660b-4266-ba11-0c1f93700217-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jbn4f\" (UID: \"23dbedd7-660b-4266-ba11-0c1f93700217\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:21.594009 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.593984 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" Apr 28 19:18:21.674700 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.674659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" Apr 28 19:18:21.675033 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:21.674938 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:21.675033 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:21.675004 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls podName:8bf92c20-e551-497f-995e-ea716db91e5d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:37.674985242 +0000 UTC m=+175.074730312 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4h69b" (UID: "8bf92c20-e551-497f-995e-ea716db91e5d") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:21.734222 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.734190 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f"] Apr 28 19:18:21.775630 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:21.775607 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" Apr 28 19:18:21.775754 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:21.775737 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 28 19:18:21.775801 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:21.775796 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert podName:d16fc61d-1bc8-4386-8d52-c516641eb2f9 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:37.775778919 +0000 UTC m=+175.175523974 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wl476" (UID: "d16fc61d-1bc8-4386-8d52-c516641eb2f9") : secret "networking-console-plugin-cert" not found Apr 28 19:18:22.529442 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:22.529405 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" event={"ID":"23dbedd7-660b-4266-ba11-0c1f93700217","Type":"ContainerStarted","Data":"01833702674b8d848a11fc1bc4e0948bcca5eb1c6707cbc9b2a85b871d136b67"} Apr 28 19:18:24.497663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.497611 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 19:18:24.497663 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.497669 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:18:24.500095 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.500069 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"image-registry-7c9bd88988-mljkv\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:18:24.500189 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:18:24.500070 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706dde5a-6656-4009-a585-2d9b3cbd4ecd-cert\") pod \"ingress-canary-qfmts\" (UID: \"706dde5a-6656-4009-a585-2d9b3cbd4ecd\") " pod="openshift-ingress-canary/ingress-canary-qfmts" Apr 28 19:18:24.536403 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.536370 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" event={"ID":"23dbedd7-660b-4266-ba11-0c1f93700217","Type":"ContainerStarted","Data":"113825cdda4f4a4817deaa528747ccae42e421c0525798af46f7a1453f392a38"} Apr 28 19:18:24.536561 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.536408 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" event={"ID":"23dbedd7-660b-4266-ba11-0c1f93700217","Type":"ContainerStarted","Data":"bdba4b76ffc4a069f4097c7fa00bf1bb02d49da9e6d92739d205b0f7c5707246"} Apr 28 19:18:24.553624 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.553578 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jbn4f" podStartSLOduration=17.705755475 podStartE2EDuration="19.553563324s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:21.773080943 +0000 UTC m=+159.172825999" lastFinishedPulling="2026-04-28 19:18:23.620888778 +0000 UTC m=+161.020633848" observedRunningTime="2026-04-28 19:18:24.553145557 +0000 UTC m=+161.952890660" watchObservedRunningTime="2026-04-28 19:18:24.553563324 +0000 UTC m=+161.953308402" Apr 28 19:18:24.598511 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.598474 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:18:24.600819 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.600796 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2692402f-8242-404a-8d92-642c2dec47fb-metrics-tls\") pod \"dns-default-hwwt4\" (UID: \"2692402f-8242-404a-8d92-642c2dec47fb\") " pod="openshift-dns/dns-default-hwwt4" Apr 28 19:18:24.725573 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.725539 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lcrng\"" Apr 28 19:18:24.725738 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.725577 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\"" Apr 28 19:18:24.734570 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.734548 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwwt4" Apr 28 19:18:24.734611 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.734585 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:18:24.871799 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.871767 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"] Apr 28 19:18:24.875169 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:24.875142 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68018594_cd79_418d_92f7_ff2244ebff00.slice/crio-c631d1609d7b31b340b0784c6faec4e8a6eca2fad439c59b1f6c3ac89466cf2d WatchSource:0}: Error finding container c631d1609d7b31b340b0784c6faec4e8a6eca2fad439c59b1f6c3ac89466cf2d: Status 404 returned error can't find the container with id c631d1609d7b31b340b0784c6faec4e8a6eca2fad439c59b1f6c3ac89466cf2d Apr 28 19:18:24.885719 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:24.885697 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwwt4"] Apr 28 19:18:24.888048 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:24.888023 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2692402f_8242_404a_8d92_642c2dec47fb.slice/crio-b285fce46bdb327f3cd36300a853fe15fc985173f899a46b654f10ad53abe22c WatchSource:0}: Error finding container b285fce46bdb327f3cd36300a853fe15fc985173f899a46b654f10ad53abe22c: Status 404 returned error can't find the container with id b285fce46bdb327f3cd36300a853fe15fc985173f899a46b654f10ad53abe22c Apr 28 19:18:25.540632 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:25.540591 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwwt4" event={"ID":"2692402f-8242-404a-8d92-642c2dec47fb","Type":"ContainerStarted","Data":"b285fce46bdb327f3cd36300a853fe15fc985173f899a46b654f10ad53abe22c"} Apr 28 19:18:25.542091 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:25.542064 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" event={"ID":"68018594-cd79-418d-92f7-ff2244ebff00","Type":"ContainerStarted","Data":"9cd1815ae4b383e2dd1f7791f1940f0c757787daab5fdcc3ee54f5c36be1731c"} Apr 28 19:18:25.542238 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:25.542099 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" event={"ID":"68018594-cd79-418d-92f7-ff2244ebff00","Type":"ContainerStarted","Data":"c631d1609d7b31b340b0784c6faec4e8a6eca2fad439c59b1f6c3ac89466cf2d"} Apr 28 19:18:25.565039 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:25.564993 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" podStartSLOduration=162.564980289 podStartE2EDuration="2m42.564980289s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:25.564395193 +0000 UTC m=+162.964140305" watchObservedRunningTime="2026-04-28 19:18:25.564980289 +0000 UTC m=+162.964725366" Apr 28 19:18:26.546765 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:26.546726 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwwt4" event={"ID":"2692402f-8242-404a-8d92-642c2dec47fb","Type":"ContainerStarted","Data":"0e9ce743d0916f0104622b85060d9c1161ec635da128762aa8be26cd71205093"} Apr 28 19:18:26.546765 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:26.546765 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwwt4" event={"ID":"2692402f-8242-404a-8d92-642c2dec47fb","Type":"ContainerStarted","Data":"2c094d5790ed972240e6cdd639a7c2870020730f009787e732e658ef1e547fbb"} Apr 28 19:18:26.547239 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:26.547103 2565 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:18:26.547239 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:26.547140 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hwwt4" Apr 28 19:18:26.564465 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:26.564427 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hwwt4" podStartSLOduration=129.389682943 podStartE2EDuration="2m10.564417346s" podCreationTimestamp="2026-04-28 19:16:16 +0000 UTC" firstStartedPulling="2026-04-28 19:18:24.889821968 +0000 UTC m=+162.289567037" lastFinishedPulling="2026-04-28 19:18:26.064556377 +0000 UTC m=+163.464301440" observedRunningTime="2026-04-28 19:18:26.56401519 +0000 UTC m=+163.963760267" watchObservedRunningTime="2026-04-28 19:18:26.564417346 +0000 UTC m=+163.964162421" Apr 28 19:18:27.120248 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.120221 2565 scope.go:117] "RemoveContainer" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab" Apr 28 19:18:27.551169 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.551145 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:18:27.551516 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.551487 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/1.log" Apr 28 19:18:27.551552 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.551520 2565 generic.go:358] "Generic (PLEG): container finished" podID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20" containerID="4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d" exitCode=255 Apr 28 19:18:27.551621 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:18:27.551599 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" event={"ID":"58244bb4-99cf-41d7-91d2-e3c4ffe45e20","Type":"ContainerDied","Data":"4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d"} Apr 28 19:18:27.551665 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.551646 2565 scope.go:117] "RemoveContainer" containerID="76e59102be285f4360ac7af6f437abe5405413ed5280bcc9bb977ed1e34d2fab" Apr 28 19:18:27.552090 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:27.552061 2565 scope.go:117] "RemoveContainer" containerID="4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d" Apr 28 19:18:27.552251 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:27.552225 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20" Apr 28 19:18:28.555618 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:28.555590 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:18:32.119959 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:32.119812 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:18:32.122786 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:32.122764 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\""
Apr 28 19:18:32.130599 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:32.130581 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfmts"
Apr 28 19:18:32.245412 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:32.245385 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfmts"]
Apr 28 19:18:32.248721 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:32.248694 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706dde5a_6656_4009_a585_2d9b3cbd4ecd.slice/crio-a208396b6afafe0137f07adebdc49fb4cb43024441dd5da312440011695e6bf8 WatchSource:0}: Error finding container a208396b6afafe0137f07adebdc49fb4cb43024441dd5da312440011695e6bf8: Status 404 returned error can't find the container with id a208396b6afafe0137f07adebdc49fb4cb43024441dd5da312440011695e6bf8
Apr 28 19:18:32.567219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:32.567189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfmts" event={"ID":"706dde5a-6656-4009-a585-2d9b3cbd4ecd","Type":"ContainerStarted","Data":"a208396b6afafe0137f07adebdc49fb4cb43024441dd5da312440011695e6bf8"}
Apr 28 19:18:33.121600 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.121572 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq"
Apr 28 19:18:33.369221 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.369193 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h4x8g"]
Apr 28 19:18:33.374003 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.373951 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.376263 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.376230 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:18:33.376793 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.376770 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ql9lz\""
Apr 28 19:18:33.376793 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.376781 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:18:33.389143 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.389103 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4x8g"]
Apr 28 19:18:33.464824 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.464791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4pb\" (UniqueName: \"kubernetes.io/projected/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-api-access-kb4pb\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.464824 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.464821 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.464997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.464954 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/114be85d-4b9f-4f04-b1bb-9d17a166efa5-data-volume\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.464997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.464981 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/114be85d-4b9f-4f04-b1bb-9d17a166efa5-crio-socket\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.465093 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.465006 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/114be85d-4b9f-4f04-b1bb-9d17a166efa5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.565728 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565697 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/114be85d-4b9f-4f04-b1bb-9d17a166efa5-crio-socket\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.565922 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565744 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/114be85d-4b9f-4f04-b1bb-9d17a166efa5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.565922 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4pb\" (UniqueName: \"kubernetes.io/projected/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-api-access-kb4pb\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.565922 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565830 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.565922 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565846 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/114be85d-4b9f-4f04-b1bb-9d17a166efa5-crio-socket\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.566126 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.565952 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/114be85d-4b9f-4f04-b1bb-9d17a166efa5-data-volume\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.566366 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.566336 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/114be85d-4b9f-4f04-b1bb-9d17a166efa5-data-volume\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.566600 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.566579 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.568542 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.568511 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/114be85d-4b9f-4f04-b1bb-9d17a166efa5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.575291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.575270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4pb\" (UniqueName: \"kubernetes.io/projected/114be85d-4b9f-4f04-b1bb-9d17a166efa5-kube-api-access-kb4pb\") pod \"insights-runtime-extractor-h4x8g\" (UID: \"114be85d-4b9f-4f04-b1bb-9d17a166efa5\") " pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.684591 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.684514 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4x8g"
Apr 28 19:18:33.931111 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:33.931079 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4x8g"]
Apr 28 19:18:33.933572 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:33.933548 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114be85d_4b9f_4f04_b1bb_9d17a166efa5.slice/crio-4c39ceeda4b25f93434da38e566fb66525eb1a0af9be347383e7a32cffb849e5 WatchSource:0}: Error finding container 4c39ceeda4b25f93434da38e566fb66525eb1a0af9be347383e7a32cffb849e5: Status 404 returned error can't find the container with id 4c39ceeda4b25f93434da38e566fb66525eb1a0af9be347383e7a32cffb849e5
Apr 28 19:18:34.574584 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:34.574523 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfmts" event={"ID":"706dde5a-6656-4009-a585-2d9b3cbd4ecd","Type":"ContainerStarted","Data":"92487d14b7da022e259789517800e28b8d40316d04bace76248e79388a4d30d6"}
Apr 28 19:18:34.576218 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:34.576196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4x8g" event={"ID":"114be85d-4b9f-4f04-b1bb-9d17a166efa5","Type":"ContainerStarted","Data":"cfdc7250e4ebe2f473f80b6d1d28b4df3cf2c94b49bc06d6694ab6c0c118f3b3"}
Apr 28 19:18:34.576291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:34.576222 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4x8g" event={"ID":"114be85d-4b9f-4f04-b1bb-9d17a166efa5","Type":"ContainerStarted","Data":"f7b9e25378776556bbca5be21b3a187699963388b2a521c73cf964739870e0b3"}
Apr 28 19:18:34.576291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:34.576236 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4x8g" event={"ID":"114be85d-4b9f-4f04-b1bb-9d17a166efa5","Type":"ContainerStarted","Data":"4c39ceeda4b25f93434da38e566fb66525eb1a0af9be347383e7a32cffb849e5"}
Apr 28 19:18:34.590367 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:34.590312 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qfmts" podStartSLOduration=136.977247944 podStartE2EDuration="2m18.590296617s" podCreationTimestamp="2026-04-28 19:16:16 +0000 UTC" firstStartedPulling="2026-04-28 19:18:32.250565426 +0000 UTC m=+169.650310483" lastFinishedPulling="2026-04-28 19:18:33.86361408 +0000 UTC m=+171.263359156" observedRunningTime="2026-04-28 19:18:34.589749796 +0000 UTC m=+171.989494873" watchObservedRunningTime="2026-04-28 19:18:34.590296617 +0000 UTC m=+171.990041697"
Apr 28 19:18:36.013149 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.013089 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-z897l"
Apr 28 19:18:36.013149 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.013126 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-z897l"
Apr 28 19:18:36.013523 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.013420 2565 scope.go:117] "RemoveContainer" containerID="4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d"
Apr 28 19:18:36.013590 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:36.013568 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20"
Apr 28 19:18:36.553552 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.553519 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hwwt4"
Apr 28 19:18:36.583352 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.583317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4x8g" event={"ID":"114be85d-4b9f-4f04-b1bb-9d17a166efa5","Type":"ContainerStarted","Data":"da8523359e1d663574415365cade9485e9411ef27b4dae293db8838a65c5df18"}
Apr 28 19:18:36.604712 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:36.604674 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-h4x8g" podStartSLOduration=1.828905313 podStartE2EDuration="3.604660416s" podCreationTimestamp="2026-04-28 19:18:33 +0000 UTC" firstStartedPulling="2026-04-28 19:18:33.987198086 +0000 UTC m=+171.386943144" lastFinishedPulling="2026-04-28 19:18:35.762953177 +0000 UTC m=+173.162698247" observedRunningTime="2026-04-28 19:18:36.604343564 +0000 UTC m=+174.004088643" watchObservedRunningTime="2026-04-28 19:18:36.604660416 +0000 UTC m=+174.004405494"
Apr 28 19:18:37.703051 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.703015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:37.705300 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.705274 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bf92c20-e551-497f-995e-ea716db91e5d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4h69b\" (UID: \"8bf92c20-e551-497f-995e-ea716db91e5d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:37.803704 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.803679 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:37.805928 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.805885 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d16fc61d-1bc8-4386-8d52-c516641eb2f9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wl476\" (UID: \"d16fc61d-1bc8-4386-8d52-c516641eb2f9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:37.909369 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.909345 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"
Apr 28 19:18:37.992221 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:37.992190 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476"
Apr 28 19:18:38.040557 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:38.040508 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b"]
Apr 28 19:18:38.045297 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:38.045269 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bf92c20_e551_497f_995e_ea716db91e5d.slice/crio-8f67eb32913dc4b8e9ccac7ce2fe2c8a99b0a4230b2df67a12dc7feedce0a368 WatchSource:0}: Error finding container 8f67eb32913dc4b8e9ccac7ce2fe2c8a99b0a4230b2df67a12dc7feedce0a368: Status 404 returned error can't find the container with id 8f67eb32913dc4b8e9ccac7ce2fe2c8a99b0a4230b2df67a12dc7feedce0a368
Apr 28 19:18:38.123094 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:38.123063 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wl476"]
Apr 28 19:18:38.130190 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:38.130166 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16fc61d_1bc8_4386_8d52_c516641eb2f9.slice/crio-387ec604d521a06bf53ae3eb974124cd2a16f39a6261aaeb1c9c8a701b561ad8 WatchSource:0}: Error finding container 387ec604d521a06bf53ae3eb974124cd2a16f39a6261aaeb1c9c8a701b561ad8: Status 404 returned error can't find the container with id 387ec604d521a06bf53ae3eb974124cd2a16f39a6261aaeb1c9c8a701b561ad8
Apr 28 19:18:38.590028 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:38.589989 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" event={"ID":"d16fc61d-1bc8-4386-8d52-c516641eb2f9","Type":"ContainerStarted","Data":"387ec604d521a06bf53ae3eb974124cd2a16f39a6261aaeb1c9c8a701b561ad8"}
Apr 28 19:18:38.591092 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:38.591062 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" event={"ID":"8bf92c20-e551-497f-995e-ea716db91e5d","Type":"ContainerStarted","Data":"8f67eb32913dc4b8e9ccac7ce2fe2c8a99b0a4230b2df67a12dc7feedce0a368"}
Apr 28 19:18:40.598923 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:40.598876 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" event={"ID":"d16fc61d-1bc8-4386-8d52-c516641eb2f9","Type":"ContainerStarted","Data":"bd8fa84f44bf00127c4fc1b4354736fc887d46e99bc4b0d7657636665174929f"}
Apr 28 19:18:40.600198 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:40.600171 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" event={"ID":"8bf92c20-e551-497f-995e-ea716db91e5d","Type":"ContainerStarted","Data":"3326d65c3f442edc9ecde4d6d6152265a3f392228bd8f61a1d6bdd5f9f3997d2"}
Apr 28 19:18:40.618939 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:40.618876 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wl476" podStartSLOduration=33.690448822 podStartE2EDuration="35.618864581s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:38.131922679 +0000 UTC m=+175.531667735" lastFinishedPulling="2026-04-28 19:18:40.060338436 +0000 UTC m=+177.460083494" observedRunningTime="2026-04-28 19:18:40.618137355 +0000 UTC m=+178.017882432" watchObservedRunningTime="2026-04-28 19:18:40.618864581 +0000 UTC m=+178.018609656"
Apr 28 19:18:40.639156 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:40.639077 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" podStartSLOduration=33.313038169 podStartE2EDuration="35.639064843s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:38.04942675 +0000 UTC m=+175.449171806" lastFinishedPulling="2026-04-28 19:18:40.375453424 +0000 UTC m=+177.775198480" observedRunningTime="2026-04-28 19:18:40.638235775 +0000 UTC m=+178.037980865" watchObservedRunningTime="2026-04-28 19:18:40.639064843 +0000 UTC m=+178.038809920"
Apr 28 19:18:44.739401 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:44.739352 2565 patch_prober.go:28] interesting pod/image-registry-7c9bd88988-mljkv container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 28 19:18:44.739766 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:44.739415 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" podUID="68018594-cd79-418d-92f7-ff2244ebff00" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:18:47.120288 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:47.120254 2565 scope.go:117] "RemoveContainer" containerID="4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d"
Apr 28 19:18:47.120665 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:47.120496 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-z897l_openshift-console-operator(58244bb4-99cf-41d7-91d2-e3c4ffe45e20)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podUID="58244bb4-99cf-41d7-91d2-e3c4ffe45e20"
Apr 28 19:18:47.556344 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:47.556319 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv"
Apr 28 19:18:48.398985 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.398952 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"]
Apr 28 19:18:48.403861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.403842 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.407437 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.407401 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 28 19:18:48.407562 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.407412 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:18:48.407562 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.407460 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 28 19:18:48.407685 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.407615 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-m2mkd\""
Apr 28 19:18:48.412433 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.412410 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"]
Apr 28 19:18:48.437170 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.437147 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n8zk8"]
Apr 28 19:18:48.440736 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.440712 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.443019 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.442748 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:18:48.443019 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.442813 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:18:48.443019 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.442752 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xgvrs\""
Apr 28 19:18:48.443398 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.443379 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:18:48.480481 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.480457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.480586 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.480513 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.480725 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.480683 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.480809 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.480765 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qt9\" (UniqueName: \"kubernetes.io/projected/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-kube-api-access-s2qt9\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.582058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582031 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-tls\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdd6d\" (UniqueName: \"kubernetes.io/projected/801be666-fce3-4981-98a0-f1f3b5b08af0-kube-api-access-jdd6d\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582114 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582138 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-sys\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-root\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582183 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.582384 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582248 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qt9\" (UniqueName: \"kubernetes.io/projected/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-kube-api-access-s2qt9\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.582384 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582296 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-textfile\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582384 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582315 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-wtmp\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582384 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582351 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.582559 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.582559 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582431 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-accelerators-collector-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582559 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582460 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-metrics-client-ca\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.582914 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.582869 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.584792 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.584772 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.584888 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.584867 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.593738 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.593718 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qt9\" (UniqueName: \"kubernetes.io/projected/ea906e01-c15b-4bcc-bec1-c9d43cdd11f3-kube-api-access-s2qt9\") pod \"openshift-state-metrics-9d44df66c-sptmr\" (UID: \"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"
Apr 28 19:18:48.682875 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.682807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-accelerators-collector-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.682875 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.682843 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-metrics-client-ca\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.682876 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-tls\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.682944 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdd6d\" (UniqueName: \"kubernetes.io/projected/801be666-fce3-4981-98a0-f1f3b5b08af0-kube-api-access-jdd6d\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.682972 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-sys\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683058 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683031 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-root\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683112 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-textfile\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683139 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-wtmp\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683229 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-sys\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683275 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-root\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683480 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-wtmp\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683526 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683508 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-accelerators-collector-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8"
Apr 28 19:18:48.683715 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683696 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-textfile\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.683836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.683820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/801be666-fce3-4981-98a0-f1f3b5b08af0-metrics-client-ca\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.685339 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.685314 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.685551 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.685532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/801be666-fce3-4981-98a0-f1f3b5b08af0-node-exporter-tls\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.690186 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.690163 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdd6d\" (UniqueName: \"kubernetes.io/projected/801be666-fce3-4981-98a0-f1f3b5b08af0-kube-api-access-jdd6d\") pod \"node-exporter-n8zk8\" (UID: \"801be666-fce3-4981-98a0-f1f3b5b08af0\") " pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.716029 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.716005 2565 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" Apr 28 19:18:48.751265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.751192 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n8zk8" Apr 28 19:18:48.762035 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:48.762003 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801be666_fce3_4981_98a0_f1f3b5b08af0.slice/crio-0ed4eb1462b8fff9c137d9bec01096efed0623926098d86a54646dd2506c885c WatchSource:0}: Error finding container 0ed4eb1462b8fff9c137d9bec01096efed0623926098d86a54646dd2506c885c: Status 404 returned error can't find the container with id 0ed4eb1462b8fff9c137d9bec01096efed0623926098d86a54646dd2506c885c Apr 28 19:18:48.844427 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:48.844403 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr"] Apr 28 19:18:48.846875 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:48.846849 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea906e01_c15b_4bcc_bec1_c9d43cdd11f3.slice/crio-a16c0b47d7ebe38f1d66e1cef4ddff78c87ea664f5b0399efdb14fd9b7f530ad WatchSource:0}: Error finding container a16c0b47d7ebe38f1d66e1cef4ddff78c87ea664f5b0399efdb14fd9b7f530ad: Status 404 returned error can't find the container with id a16c0b47d7ebe38f1d66e1cef4ddff78c87ea664f5b0399efdb14fd9b7f530ad Apr 28 19:18:49.526986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.526649 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:49.532110 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.532087 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.534255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534228 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 28 19:18:49.534454 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534429 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 28 19:18:49.534570 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534485 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 28 19:18:49.534570 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534543 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 28 19:18:49.535000 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534740 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8sm7b\"" Apr 28 19:18:49.535000 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534828 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 28 19:18:49.535000 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534853 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 28 19:18:49.535000 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534877 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 28 19:18:49.535000 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.534959 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 28 19:18:49.535290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.535151 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 28 19:18:49.548939 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.548867 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:49.626158 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.626129 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n8zk8" event={"ID":"801be666-fce3-4981-98a0-f1f3b5b08af0","Type":"ContainerStarted","Data":"65f48e15b8bc22ab80fe24e94a06bf0553507fb01914cfd4e76cb5161ed6be38"} Apr 28 19:18:49.626282 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.626169 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n8zk8" event={"ID":"801be666-fce3-4981-98a0-f1f3b5b08af0","Type":"ContainerStarted","Data":"0ed4eb1462b8fff9c137d9bec01096efed0623926098d86a54646dd2506c885c"} Apr 28 19:18:49.627909 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.627867 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" event={"ID":"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3","Type":"ContainerStarted","Data":"6080b101b71dc178f840886dd11d525eacfc5d94975ca2875adb652b9878e3bc"} Apr 28 19:18:49.628011 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.627914 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" event={"ID":"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3","Type":"ContainerStarted","Data":"0f58090ef185c0c65f1e45d38cc3e156712968658598400902f33eb3278d597a"} Apr 28 19:18:49.628011 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.627927 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" event={"ID":"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3","Type":"ContainerStarted","Data":"a16c0b47d7ebe38f1d66e1cef4ddff78c87ea664f5b0399efdb14fd9b7f530ad"} Apr 28 19:18:49.692941 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.692890 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693065 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.692950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693065 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.692977 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693065 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693048 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693226 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:18:49.693097 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693226 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693140 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693330 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693222 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvwq\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693330 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693269 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693421 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693327 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693421 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693522 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693430 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693522 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693461 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.693522 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.693507 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794205 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794253 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794281 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794322 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794350 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794386 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794453 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvwq\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794488 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794816 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794816 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794564 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out\") pod 
\"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794816 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794595 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794816 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794625 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.794816 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.794665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.795500 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.795468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.795867 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.795691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.796030 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.795482 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.797930 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.797871 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.799191 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.799167 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.799288 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.799247 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.799405 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.799380 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.799563 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.799546 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.800276 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.800231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.800372 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.800302 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.800618 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.800589 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.801625 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.801600 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.803672 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.803650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvwq\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq\") pod \"alertmanager-main-0\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:49.849631 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:49.849606 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:18:50.122482 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.122451 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:18:50.126486 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:50.126458 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad1cd3f_cb65_4c84_82de_9ca7210a9f70.slice/crio-763ea0797fb788995c2b60d78b8e8fb4543106b01b25dd1cddd86422bf4570e3 WatchSource:0}: Error finding container 763ea0797fb788995c2b60d78b8e8fb4543106b01b25dd1cddd86422bf4570e3: Status 404 returned error can't find the container with id 763ea0797fb788995c2b60d78b8e8fb4543106b01b25dd1cddd86422bf4570e3 Apr 28 19:18:50.635237 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.635200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" 
event={"ID":"ea906e01-c15b-4bcc-bec1-c9d43cdd11f3","Type":"ContainerStarted","Data":"679dd809ff1d622b365ba72b15b3fdf84c66415a260a4ba1e6e75b035437ffb8"} Apr 28 19:18:50.636512 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.636483 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"763ea0797fb788995c2b60d78b8e8fb4543106b01b25dd1cddd86422bf4570e3"} Apr 28 19:18:50.637787 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.637758 2565 generic.go:358] "Generic (PLEG): container finished" podID="801be666-fce3-4981-98a0-f1f3b5b08af0" containerID="65f48e15b8bc22ab80fe24e94a06bf0553507fb01914cfd4e76cb5161ed6be38" exitCode=0 Apr 28 19:18:50.637965 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.637806 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n8zk8" event={"ID":"801be666-fce3-4981-98a0-f1f3b5b08af0","Type":"ContainerDied","Data":"65f48e15b8bc22ab80fe24e94a06bf0553507fb01914cfd4e76cb5161ed6be38"} Apr 28 19:18:50.684599 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:50.684552 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-sptmr" podStartSLOduration=1.664057264 podStartE2EDuration="2.684537798s" podCreationTimestamp="2026-04-28 19:18:48 +0000 UTC" firstStartedPulling="2026-04-28 19:18:48.973948424 +0000 UTC m=+186.373693481" lastFinishedPulling="2026-04-28 19:18:49.994428945 +0000 UTC m=+187.394174015" observedRunningTime="2026-04-28 19:18:50.68409962 +0000 UTC m=+188.083844698" watchObservedRunningTime="2026-04-28 19:18:50.684537798 +0000 UTC m=+188.084282876" Apr 28 19:18:51.647146 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:51.647112 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n8zk8" 
event={"ID":"801be666-fce3-4981-98a0-f1f3b5b08af0","Type":"ContainerStarted","Data":"119da3f7a908ac3c5a7acc54d07c95678e9d8020f051960e3e1ac56a1d45d078"} Apr 28 19:18:51.647532 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:51.647161 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n8zk8" event={"ID":"801be666-fce3-4981-98a0-f1f3b5b08af0","Type":"ContainerStarted","Data":"60caa973b0d393a9c6771d0f449dccbc3549f6e787db982961e42be5dbd4d8a1"} Apr 28 19:18:51.648329 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:51.648308 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094" exitCode=0 Apr 28 19:18:51.648409 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:51.648372 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094"} Apr 28 19:18:51.680751 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:51.680702 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n8zk8" podStartSLOduration=2.944613406 podStartE2EDuration="3.680688858s" podCreationTimestamp="2026-04-28 19:18:48 +0000 UTC" firstStartedPulling="2026-04-28 19:18:48.763715651 +0000 UTC m=+186.163460707" lastFinishedPulling="2026-04-28 19:18:49.499791096 +0000 UTC m=+186.899536159" observedRunningTime="2026-04-28 19:18:51.678719813 +0000 UTC m=+189.078464891" watchObservedRunningTime="2026-04-28 19:18:51.680688858 +0000 UTC m=+189.080433935" Apr 28 19:18:53.215698 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.215676 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd"] Apr 28 19:18:53.220441 ip-10-0-138-119 kubenswrapper[2565]: 
I0428 19:18:53.220424 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:53.223325 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.223305 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 28 19:18:53.223416 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.223359 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-x79lj\"" Apr 28 19:18:53.229731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.229704 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd"] Apr 28 19:18:53.347519 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.347490 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vnbd\" (UID: \"7823617f-5797-402d-ae71-da8ae44f45c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:53.448522 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.448491 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vnbd\" (UID: \"7823617f-5797-402d-ae71-da8ae44f45c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:53.448649 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:53.448634 2565 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 28 19:18:53.448705 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:18:53.448695 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert podName:7823617f-5797-402d-ae71-da8ae44f45c6 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:53.948680398 +0000 UTC m=+191.348425455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-6vnbd" (UID: "7823617f-5797-402d-ae71-da8ae44f45c6") : secret "monitoring-plugin-cert" not found Apr 28 19:18:53.657062 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.656989 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f"} Apr 28 19:18:53.657062 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.657022 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7"} Apr 28 19:18:53.657062 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.657033 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837"} Apr 28 19:18:53.657062 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.657042 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1"} Apr 28 19:18:53.657062 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:18:53.657052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a"} Apr 28 19:18:53.953204 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.953132 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vnbd\" (UID: \"7823617f-5797-402d-ae71-da8ae44f45c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:53.955729 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:53.955700 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7823617f-5797-402d-ae71-da8ae44f45c6-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6vnbd\" (UID: \"7823617f-5797-402d-ae71-da8ae44f45c6\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:54.138982 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:54.138950 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:54.298493 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:54.298471 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd"] Apr 28 19:18:54.301024 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:54.300998 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7823617f_5797_402d_ae71_da8ae44f45c6.slice/crio-4b87cf6c2d0a66a51644e96e6afac67820284803ffd283e2d1af57da97e51391 WatchSource:0}: Error finding container 4b87cf6c2d0a66a51644e96e6afac67820284803ffd283e2d1af57da97e51391: Status 404 returned error can't find the container with id 4b87cf6c2d0a66a51644e96e6afac67820284803ffd283e2d1af57da97e51391 Apr 28 19:18:54.660675 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:54.660637 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" event={"ID":"7823617f-5797-402d-ae71-da8ae44f45c6","Type":"ContainerStarted","Data":"4b87cf6c2d0a66a51644e96e6afac67820284803ffd283e2d1af57da97e51391"} Apr 28 19:18:54.663235 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:54.663208 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerStarted","Data":"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9"} Apr 28 19:18:54.701412 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:54.701365 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.605700605 podStartE2EDuration="5.701351603s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.129106794 +0000 UTC m=+187.528851849" lastFinishedPulling="2026-04-28 19:18:54.224757787 +0000 UTC 
m=+191.624502847" observedRunningTime="2026-04-28 19:18:54.699776778 +0000 UTC m=+192.099521857" watchObservedRunningTime="2026-04-28 19:18:54.701351603 +0000 UTC m=+192.101096680" Apr 28 19:18:55.138120 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.138088 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:55.142166 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.142135 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.146284 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.146255 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 19:18:55.146524 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.146392 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:18:55.146524 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.146515 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8tqr\"" Apr 28 19:18:55.146655 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.146586 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:18:55.147403 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.147218 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:18:55.147403 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.147325 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:18:55.148210 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.148186 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:18:55.149934 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.149911 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:18:55.150052 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.150035 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:18:55.150693 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.150232 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:18:55.151214 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.151198 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1oeck03afes0m\"" Apr 28 19:18:55.151699 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.151683 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:18:55.153604 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.153586 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:18:55.153777 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.153765 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:18:55.163644 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163733 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163733 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163692 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163733 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163716 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163877 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163877 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163752 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-wvksj\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.163877 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163775 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164042 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163882 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164042 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163961 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164042 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.163996 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164042 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:18:55.164038 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164223 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164072 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164223 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164099 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164223 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164140 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164223 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164170 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164345 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164247 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164345 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164307 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.164345 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.164337 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.168499 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.168478 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:18:55.248537 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.248504 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:55.265591 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265564 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265597 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265618 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265686 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.265711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265712 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265742 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265785 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265834 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265857 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265912 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvksj\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj\") 
pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265930 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266033 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.265978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266965 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.266764 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.266965 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.266790 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.269255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.268949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.270529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.269439 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.270529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.270112 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.270529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.270437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.270529 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.270493 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.271189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.271145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.271321 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.271297 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.271464 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.271438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.271593 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.271567 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.272346 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.272322 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.272602 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.272580 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.272856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.272814 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.272973 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.272915 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.273412 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.273389 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.274430 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.274410 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.279450 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.279431 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvksj\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj\") pod \"prometheus-k8s-0\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.453412 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.453337 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:18:55.547102 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.547074 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"] Apr 28 19:18:55.625258 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.625232 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:18:55.627878 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:18:55.627843 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda13ef8_dc6e_4c55_9449_4617c04114b5.slice/crio-2a509f7f70147e4b5f9c4b55046e76a2e39d66912f4a8a81578a90295938d340 WatchSource:0}: Error finding container 2a509f7f70147e4b5f9c4b55046e76a2e39d66912f4a8a81578a90295938d340: Status 404 returned error can't find the container with id 2a509f7f70147e4b5f9c4b55046e76a2e39d66912f4a8a81578a90295938d340 Apr 28 19:18:55.667143 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.667111 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" event={"ID":"7823617f-5797-402d-ae71-da8ae44f45c6","Type":"ContainerStarted","Data":"96d4c6186a0df9f5bfcd3c687ac6bff44c338156ca15f4817f081b97ae5cd0a3"} Apr 28 19:18:55.667337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.667300 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:55.668301 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.668277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"2a509f7f70147e4b5f9c4b55046e76a2e39d66912f4a8a81578a90295938d340"} Apr 28 19:18:55.672579 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.672555 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" Apr 28 19:18:55.685909 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:55.685864 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6vnbd" podStartSLOduration=1.446196752 podStartE2EDuration="2.685851785s" podCreationTimestamp="2026-04-28 19:18:53 +0000 UTC" firstStartedPulling="2026-04-28 19:18:54.302745899 +0000 UTC m=+191.702490955" lastFinishedPulling="2026-04-28 19:18:55.542400932 +0000 UTC m=+192.942145988" observedRunningTime="2026-04-28 19:18:55.685609107 +0000 UTC m=+193.085354176" watchObservedRunningTime="2026-04-28 19:18:55.685851785 +0000 UTC m=+193.085596862" Apr 28 19:18:56.678852 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:56.678814 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a" exitCode=0 Apr 28 19:18:56.679230 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:18:56.678915 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} Apr 28 19:18:58.120877 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.120850 2565 scope.go:117] "RemoveContainer" containerID="4881ab008b562290edc8426e083fc0db542fd6250c03e7a8c3561276008ab36d" Apr 28 19:18:58.687111 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.687080 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:18:58.687290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.687214 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" event={"ID":"58244bb4-99cf-41d7-91d2-e3c4ffe45e20","Type":"ContainerStarted","Data":"4aea10ca7bdc64babc9b93efc3659aa4676e923f87305102939c03f06287aa44"} Apr 28 19:18:58.687603 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.687572 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:58.897118 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.897088 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" Apr 28 19:18:58.921355 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:18:58.921307 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-z897l" podStartSLOduration=50.850306087 podStartE2EDuration="53.921293686s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:06.152050942 +0000 UTC m=+143.551796000" lastFinishedPulling="2026-04-28 
19:18:09.223038543 +0000 UTC m=+146.622783599" observedRunningTime="2026-04-28 19:18:58.720136383 +0000 UTC m=+196.119881461" watchObservedRunningTime="2026-04-28 19:18:58.921293686 +0000 UTC m=+196.321038765" Apr 28 19:19:00.696955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:00.696837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} Apr 28 19:19:00.696955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:00.696886 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} Apr 28 19:19:02.705091 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:02.705055 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} Apr 28 19:19:02.705091 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:02.705094 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} Apr 28 19:19:02.705479 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:02.705108 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} Apr 28 19:19:02.705479 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:02.705120 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerStarted","Data":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} Apr 28 19:19:02.768213 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:02.768163 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.137416435 podStartE2EDuration="7.768149715s" podCreationTimestamp="2026-04-28 19:18:55 +0000 UTC" firstStartedPulling="2026-04-28 19:18:56.679955531 +0000 UTC m=+194.079700600" lastFinishedPulling="2026-04-28 19:19:02.31068882 +0000 UTC m=+199.710433880" observedRunningTime="2026-04-28 19:19:02.767539761 +0000 UTC m=+200.167284839" watchObservedRunningTime="2026-04-28 19:19:02.768149715 +0000 UTC m=+200.167894793" Apr 28 19:19:05.454322 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:05.454288 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:20.569753 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.569703 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" podUID="68018594-cd79-418d-92f7-ff2244ebff00" containerName="registry" containerID="cri-o://9cd1815ae4b383e2dd1f7791f1940f0c757787daab5fdcc3ee54f5c36be1731c" gracePeriod=30 Apr 28 19:19:20.759205 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.759172 2565 generic.go:358] "Generic (PLEG): container finished" podID="3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787" containerID="93d25ebbb0ec8f0ba5874f4d26430fa661c777f949afc903fe619b03d684d52c" exitCode=0 Apr 28 19:19:20.759337 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.759252 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" 
event={"ID":"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787","Type":"ContainerDied","Data":"93d25ebbb0ec8f0ba5874f4d26430fa661c777f949afc903fe619b03d684d52c"} Apr 28 19:19:20.759687 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.759670 2565 scope.go:117] "RemoveContainer" containerID="93d25ebbb0ec8f0ba5874f4d26430fa661c777f949afc903fe619b03d684d52c" Apr 28 19:19:20.760920 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.760879 2565 generic.go:358] "Generic (PLEG): container finished" podID="32912883-a580-4985-b771-5a3b8b2d6ab5" containerID="caf7ec225964dab57c8ae3ab5e05e1965e6bab0410cf2ff13de70b90b4e17797" exitCode=0 Apr 28 19:19:20.761018 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.760977 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" event={"ID":"32912883-a580-4985-b771-5a3b8b2d6ab5","Type":"ContainerDied","Data":"caf7ec225964dab57c8ae3ab5e05e1965e6bab0410cf2ff13de70b90b4e17797"} Apr 28 19:19:20.761375 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.761303 2565 scope.go:117] "RemoveContainer" containerID="caf7ec225964dab57c8ae3ab5e05e1965e6bab0410cf2ff13de70b90b4e17797" Apr 28 19:19:20.762708 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.762554 2565 generic.go:358] "Generic (PLEG): container finished" podID="68018594-cd79-418d-92f7-ff2244ebff00" containerID="9cd1815ae4b383e2dd1f7791f1940f0c757787daab5fdcc3ee54f5c36be1731c" exitCode=0 Apr 28 19:19:20.762708 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.762581 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" event={"ID":"68018594-cd79-418d-92f7-ff2244ebff00","Type":"ContainerDied","Data":"9cd1815ae4b383e2dd1f7791f1940f0c757787daab5fdcc3ee54f5c36be1731c"} Apr 28 19:19:20.808648 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.808591 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:19:20.872200 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872149 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872200 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872192 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872432 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872225 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dks7\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872432 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872273 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872432 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872317 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: 
\"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872432 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872356 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872432 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872393 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.872685 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.872444 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token\") pod \"68018594-cd79-418d-92f7-ff2244ebff00\" (UID: \"68018594-cd79-418d-92f7-ff2244ebff00\") " Apr 28 19:19:20.875069 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.875037 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:19:20.876220 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.876176 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:20.876344 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.876315 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:19:20.880226 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.878968 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:19:20.881176 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.881131 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:20.881557 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.881502 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:19:20.883552 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.883525 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7" (OuterVolumeSpecName: "kube-api-access-6dks7") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "kube-api-access-6dks7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:19:20.886688 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.886662 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "68018594-cd79-418d-92f7-ff2244ebff00" (UID: "68018594-cd79-418d-92f7-ff2244ebff00"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:19:20.973671 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973641 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-bound-sa-token\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973677 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68018594-cd79-418d-92f7-ff2244ebff00-ca-trust-extracted\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973692 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-registry-certificates\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973707 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dks7\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-kube-api-access-6dks7\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973734 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68018594-cd79-418d-92f7-ff2244ebff00-trusted-ca\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973749 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-image-registry-private-configuration\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973763 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68018594-cd79-418d-92f7-ff2244ebff00-installation-pull-secrets\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:20.973788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:20.973778 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68018594-cd79-418d-92f7-ff2244ebff00-registry-tls\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:19:21.767508 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.767473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x7t7n" event={"ID":"3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787","Type":"ContainerStarted","Data":"25df2367618bee420e827ba6f408e6a4c1d67c0d68865b0cb3510d924990e0fa"} Apr 28 19:19:21.774043 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:19:21.774013 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dzfv2" event={"ID":"32912883-a580-4985-b771-5a3b8b2d6ab5","Type":"ContainerStarted","Data":"693f9664443effb57d4efb90f863cf60799e178b6f29ea61eb56399ad62cfd39"} Apr 28 19:19:21.775178 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.775156 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" event={"ID":"68018594-cd79-418d-92f7-ff2244ebff00","Type":"ContainerDied","Data":"c631d1609d7b31b340b0784c6faec4e8a6eca2fad439c59b1f6c3ac89466cf2d"} Apr 28 19:19:21.775325 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.775191 2565 scope.go:117] "RemoveContainer" containerID="9cd1815ae4b383e2dd1f7791f1940f0c757787daab5fdcc3ee54f5c36be1731c" Apr 28 19:19:21.775325 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.775219 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7c9bd88988-mljkv" Apr 28 19:19:21.950775 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.950743 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"] Apr 28 19:19:21.957788 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:21.957762 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7c9bd88988-mljkv"] Apr 28 19:19:23.124874 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:23.124840 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68018594-cd79-418d-92f7-ff2244ebff00" path="/var/lib/kubelet/pods/68018594-cd79-418d-92f7-ff2244ebff00/volumes" Apr 28 19:19:55.038216 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.038182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:19:55.040476 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.040453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09653a58-e44e-4fb5-a021-58bc08a4765f-metrics-certs\") pod \"network-metrics-daemon-88gvq\" (UID: \"09653a58-e44e-4fb5-a021-58bc08a4765f\") " pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:19:55.324782 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.324718 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\"" Apr 28 19:19:55.332694 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.332678 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-88gvq" Apr 28 19:19:55.454040 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.454007 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:55.456887 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.456861 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-88gvq"] Apr 28 19:19:55.460432 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:19:55.460407 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09653a58_e44e_4fb5_a021_58bc08a4765f.slice/crio-ad82ccee052a0c42d7b49fb6834338da375d8590e8138714a43c5bedb9ff10a2 WatchSource:0}: Error finding container ad82ccee052a0c42d7b49fb6834338da375d8590e8138714a43c5bedb9ff10a2: Status 404 returned error can't find the container with id ad82ccee052a0c42d7b49fb6834338da375d8590e8138714a43c5bedb9ff10a2 Apr 28 19:19:55.473038 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.473016 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:55.883635 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.883600 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88gvq" event={"ID":"09653a58-e44e-4fb5-a021-58bc08a4765f","Type":"ContainerStarted","Data":"ad82ccee052a0c42d7b49fb6834338da375d8590e8138714a43c5bedb9ff10a2"} Apr 28 19:19:55.898921 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:55.898880 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:57.896964 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:57.896922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88gvq" 
event={"ID":"09653a58-e44e-4fb5-a021-58bc08a4765f","Type":"ContainerStarted","Data":"9af145a892283ef85692b762b288bb9579fd10388a70a91b93bad2ea21563e46"} Apr 28 19:19:57.896964 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:57.896966 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88gvq" event={"ID":"09653a58-e44e-4fb5-a021-58bc08a4765f","Type":"ContainerStarted","Data":"ed6ccf4d208e86e9d283a42a1f7d0808cbc5c439d2293299702bfa156f92accf"} Apr 28 19:19:57.927067 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:19:57.927024 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-88gvq" podStartSLOduration=253.604211443 podStartE2EDuration="4m14.927009794s" podCreationTimestamp="2026-04-28 19:15:43 +0000 UTC" firstStartedPulling="2026-04-28 19:19:55.462253772 +0000 UTC m=+252.861998832" lastFinishedPulling="2026-04-28 19:19:56.785052113 +0000 UTC m=+254.184797183" observedRunningTime="2026-04-28 19:19:57.92601584 +0000 UTC m=+255.325760921" watchObservedRunningTime="2026-04-28 19:19:57.927009794 +0000 UTC m=+255.326754851" Apr 28 19:20:08.930494 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:08.930459 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:20:08.930929 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:08.930505 2565 generic.go:358] "Generic (PLEG): container finished" podID="8bf92c20-e551-497f-995e-ea716db91e5d" containerID="3326d65c3f442edc9ecde4d6d6152265a3f392228bd8f61a1d6bdd5f9f3997d2" exitCode=2 Apr 28 19:20:08.930929 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:08.930581 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" 
event={"ID":"8bf92c20-e551-497f-995e-ea716db91e5d","Type":"ContainerDied","Data":"3326d65c3f442edc9ecde4d6d6152265a3f392228bd8f61a1d6bdd5f9f3997d2"} Apr 28 19:20:08.930929 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:08.930920 2565 scope.go:117] "RemoveContainer" containerID="3326d65c3f442edc9ecde4d6d6152265a3f392228bd8f61a1d6bdd5f9f3997d2" Apr 28 19:20:09.935590 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:09.935563 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:20:09.935977 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:09.935653 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4h69b" event={"ID":"8bf92c20-e551-497f-995e-ea716db91e5d","Type":"ContainerStarted","Data":"63619a41e90bb4ee3afac9eaec43aaeb9eba26a58c46b9ad97722c6f3b78991b"} Apr 28 19:20:13.533919 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.533872 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:13.534333 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.534289 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="alertmanager" containerID="cri-o://adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" gracePeriod=120 Apr 28 19:20:13.534420 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.534339 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-metric" containerID="cri-o://8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" gracePeriod=120 Apr 28 19:20:13.534484 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:13.534405 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="config-reloader" containerID="cri-o://3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" gracePeriod=120 Apr 28 19:20:13.534484 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.534403 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-web" containerID="cri-o://d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" gracePeriod=120 Apr 28 19:20:13.534484 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.534446 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy" containerID="cri-o://143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" gracePeriod=120 Apr 28 19:20:13.534615 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.534487 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="prom-label-proxy" containerID="cri-o://bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" gracePeriod=120 Apr 28 19:20:13.952752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952678 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" exitCode=0 Apr 28 19:20:13.952752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952703 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" 
containerID="143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" exitCode=0 Apr 28 19:20:13.952752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952710 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" exitCode=0 Apr 28 19:20:13.952752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952716 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" exitCode=0 Apr 28 19:20:13.953081 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952752 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9"} Apr 28 19:20:13.953081 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952790 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7"} Apr 28 19:20:13.953081 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1"} Apr 28 19:20:13.953081 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:13.952811 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a"} Apr 28 19:20:14.773255 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.773233 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:14.802689 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802664 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802772 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802692 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802772 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802717 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802772 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802736 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802772 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802751 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802783 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvvwq\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802821 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802838 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802883 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802947 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.802987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.802977 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.803272 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.803008 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.803272 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.803034 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume\") pod \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\" (UID: \"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70\") " Apr 28 19:20:14.803272 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.803115 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:20:14.803443 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.803351 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-main-db\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.803970 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.803939 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:14.804099 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.804072 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:14.805811 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.805759 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:14.806110 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.806058 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.806377 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.806355 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.807176 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.806942 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq" (OuterVolumeSpecName: "kube-api-access-qvvwq") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "kube-api-access-qvvwq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:14.810706 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.807454 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.810706 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.807888 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.810706 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.808101 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.810706 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.808366 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out" (OuterVolumeSpecName: "config-out") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:20:14.822425 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.822387 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.826861 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.826835 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config" (OuterVolumeSpecName: "web-config") pod "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" (UID: "4ad1cd3f-cb65-4c84-82de-9ca7210a9f70"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.903978 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.903999 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-cluster-tls-config\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904009 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-119.ec2.internal\" 
DevicePath \"\"" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904019 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-web-config\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904028 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvvwq\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-kube-api-access-qvvwq\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904036 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904038 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904047 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-out\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904056 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904064 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-tls-assets\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:14.904073 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-secret-alertmanager-main-tls\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904082 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-metrics-client-ca\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.904265 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.904090 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70-config-volume\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.959377 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959351 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" exitCode=0 Apr 28 19:20:14.959377 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959375 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerID="d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" exitCode=0 Apr 28 19:20:14.959506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959394 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f"} Apr 28 19:20:14.959506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959415 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837"} Apr 28 19:20:14.959506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959425 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4ad1cd3f-cb65-4c84-82de-9ca7210a9f70","Type":"ContainerDied","Data":"763ea0797fb788995c2b60d78b8e8fb4543106b01b25dd1cddd86422bf4570e3"} Apr 28 19:20:14.959506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959438 2565 scope.go:117] "RemoveContainer" containerID="bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" Apr 28 19:20:14.959506 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.959473 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:14.967467 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.967406 2565 scope.go:117] "RemoveContainer" containerID="8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" Apr 28 19:20:14.973955 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.973938 2565 scope.go:117] "RemoveContainer" containerID="143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" Apr 28 19:20:14.980356 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.980341 2565 scope.go:117] "RemoveContainer" containerID="d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" Apr 28 19:20:14.983572 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.983551 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:14.987473 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.987439 2565 scope.go:117] "RemoveContainer" containerID="3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" Apr 28 19:20:14.988604 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.988586 2565 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:14.993948 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:14.993927 2565 scope.go:117] "RemoveContainer" containerID="adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" Apr 28 19:20:15.000318 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.000303 2565 scope.go:117] "RemoveContainer" containerID="62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094" Apr 28 19:20:15.006667 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.006649 2565 scope.go:117] "RemoveContainer" containerID="bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" Apr 28 19:20:15.006941 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.006921 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9\": container with ID starting with bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9 not found: ID does not exist" containerID="bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" Apr 28 19:20:15.006994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.006948 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9"} err="failed to get container status \"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9\": rpc error: code = NotFound desc = could not find container \"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9\": container with ID starting with bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9 not found: ID does not exist" Apr 28 19:20:15.006994 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.006984 2565 scope.go:117] "RemoveContainer" containerID="8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" Apr 28 19:20:15.007236 ip-10-0-138-119 
kubenswrapper[2565]: E0428 19:20:15.007218 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f\": container with ID starting with 8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f not found: ID does not exist" containerID="8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" Apr 28 19:20:15.007278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007249 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f"} err="failed to get container status \"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f\": rpc error: code = NotFound desc = could not find container \"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f\": container with ID starting with 8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f not found: ID does not exist" Apr 28 19:20:15.007278 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007266 2565 scope.go:117] "RemoveContainer" containerID="143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" Apr 28 19:20:15.007501 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.007484 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7\": container with ID starting with 143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7 not found: ID does not exist" containerID="143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" Apr 28 19:20:15.007581 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007511 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7"} 
err="failed to get container status \"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7\": rpc error: code = NotFound desc = could not find container \"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7\": container with ID starting with 143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7 not found: ID does not exist" Apr 28 19:20:15.007581 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007536 2565 scope.go:117] "RemoveContainer" containerID="d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" Apr 28 19:20:15.007779 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.007761 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837\": container with ID starting with d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837 not found: ID does not exist" containerID="d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" Apr 28 19:20:15.007859 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007784 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837"} err="failed to get container status \"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837\": rpc error: code = NotFound desc = could not find container \"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837\": container with ID starting with d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837 not found: ID does not exist" Apr 28 19:20:15.007859 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.007801 2565 scope.go:117] "RemoveContainer" containerID="3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" Apr 28 19:20:15.008109 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.008075 2565 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1\": container with ID starting with 3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1 not found: ID does not exist" containerID="3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" Apr 28 19:20:15.008211 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008114 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1"} err="failed to get container status \"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1\": rpc error: code = NotFound desc = could not find container \"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1\": container with ID starting with 3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1 not found: ID does not exist" Apr 28 19:20:15.008211 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008134 2565 scope.go:117] "RemoveContainer" containerID="adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" Apr 28 19:20:15.008453 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.008427 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a\": container with ID starting with adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a not found: ID does not exist" containerID="adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" Apr 28 19:20:15.008536 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008457 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a"} err="failed to get container status \"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a\": rpc 
error: code = NotFound desc = could not find container \"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a\": container with ID starting with adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a not found: ID does not exist" Apr 28 19:20:15.008536 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008470 2565 scope.go:117] "RemoveContainer" containerID="62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094" Apr 28 19:20:15.008671 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:15.008655 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094\": container with ID starting with 62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094 not found: ID does not exist" containerID="62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094" Apr 28 19:20:15.008713 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008677 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094"} err="failed to get container status \"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094\": rpc error: code = NotFound desc = could not find container \"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094\": container with ID starting with 62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094 not found: ID does not exist" Apr 28 19:20:15.008713 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008690 2565 scope.go:117] "RemoveContainer" containerID="bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9" Apr 28 19:20:15.008916 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008882 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9"} 
err="failed to get container status \"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9\": rpc error: code = NotFound desc = could not find container \"bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9\": container with ID starting with bfe288c7b104fc36ef454a681e4431b7fa4e5b95760d5a64863be06c3caaf5a9 not found: ID does not exist" Apr 28 19:20:15.008981 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.008970 2565 scope.go:117] "RemoveContainer" containerID="8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f" Apr 28 19:20:15.009193 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009172 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f"} err="failed to get container status \"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f\": rpc error: code = NotFound desc = could not find container \"8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f\": container with ID starting with 8dee541e627cdb5e7ed89e7bb767a29456f1686ac551e55900a252b88e25344f not found: ID does not exist" Apr 28 19:20:15.009271 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009195 2565 scope.go:117] "RemoveContainer" containerID="143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7" Apr 28 19:20:15.009396 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009381 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7"} err="failed to get container status \"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7\": rpc error: code = NotFound desc = could not find container \"143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7\": container with ID starting with 143b853de2df31d8f9c6a26edadaf7e75a00553a93467699cacc2e2384677ad7 not found: ID does not exist" Apr 28 
19:20:15.009441 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009398 2565 scope.go:117] "RemoveContainer" containerID="d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837" Apr 28 19:20:15.009607 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009589 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837"} err="failed to get container status \"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837\": rpc error: code = NotFound desc = could not find container \"d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837\": container with ID starting with d56627fcdf0aa95836a114393eab6ede902be4ce038f4e40756525bd8c1f5837 not found: ID does not exist" Apr 28 19:20:15.009671 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009608 2565 scope.go:117] "RemoveContainer" containerID="3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1" Apr 28 19:20:15.009810 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009791 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1"} err="failed to get container status \"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1\": rpc error: code = NotFound desc = could not find container \"3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1\": container with ID starting with 3f3b999d0e56b911d6dadd13a588e79776b4d957ac0a814539379bd8c621c2e1 not found: ID does not exist" Apr 28 19:20:15.009874 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.009811 2565 scope.go:117] "RemoveContainer" containerID="adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a" Apr 28 19:20:15.010026 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.010009 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a"} err="failed to get container status \"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a\": rpc error: code = NotFound desc = could not find container \"adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a\": container with ID starting with adcac847e5bde97cde0abfa9a10f4c6c4c8c1e5e5940c7ac50d4f81ea6f6c76a not found: ID does not exist" Apr 28 19:20:15.010095 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.010026 2565 scope.go:117] "RemoveContainer" containerID="62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094" Apr 28 19:20:15.010237 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.010221 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094"} err="failed to get container status \"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094\": rpc error: code = NotFound desc = could not find container \"62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094\": container with ID starting with 62ef78268d287d365a910ca3ce7e20524522d2d89ba2907d7e78cc7cd6287094 not found: ID does not exist" Apr 28 19:20:15.017454 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017436 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:15.017765 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017753 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="alertmanager" Apr 28 19:20:15.017807 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017769 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="alertmanager" Apr 28 19:20:15.017807 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017785 2565 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="config-reloader" Apr 28 19:20:15.017807 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017791 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="config-reloader" Apr 28 19:20:15.017807 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017803 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68018594-cd79-418d-92f7-ff2244ebff00" containerName="registry" Apr 28 19:20:15.017807 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017808 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="68018594-cd79-418d-92f7-ff2244ebff00" containerName="registry" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017813 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-metric" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017819 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-metric" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017828 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="prom-label-proxy" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017833 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="prom-label-proxy" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017845 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="init-config-reloader" Apr 28 19:20:15.017986 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:15.017853 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="init-config-reloader" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017859 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-web" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017864 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-web" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017870 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017875 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017938 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="config-reloader" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017948 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-web" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017954 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="alertmanager" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017961 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="prom-label-proxy" Apr 28 19:20:15.017986 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017968 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy-metric" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017974 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" containerName="kube-rbac-proxy" Apr 28 19:20:15.017986 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.017982 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="68018594-cd79-418d-92f7-ff2244ebff00" containerName="registry" Apr 28 19:20:15.022835 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.022821 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.029330 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029311 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 28 19:20:15.029419 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029383 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 28 19:20:15.029481 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029433 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 28 19:20:15.029642 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029621 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 28 19:20:15.029747 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029733 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 28 19:20:15.029842 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:15.029825 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 28 19:20:15.029924 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029827 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 28 19:20:15.029974 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029947 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 28 19:20:15.030037 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.029999 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8sm7b\"" Apr 28 19:20:15.035186 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.035167 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 28 19:20:15.038397 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.038364 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:15.105069 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105049 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-volume\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105166 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105087 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105166 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105233 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105153 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105233 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105295 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105241 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 
19:20:15.105295 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105279 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105358 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105300 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105358 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105319 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-web-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105415 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105358 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105415 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105383 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6px\" (UniqueName: 
\"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-kube-api-access-wd6px\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105415 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105405 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.105505 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.105457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-out\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.123987 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.123965 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad1cd3f-cb65-4c84-82de-9ca7210a9f70" path="/var/lib/kubelet/pods/4ad1cd3f-cb65-4c84-82de-9ca7210a9f70/volumes" Apr 28 19:20:15.206612 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206563 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206612 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206591 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206612 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206607 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206806 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206631 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206806 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206806 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.206806 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:20:15.206792 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-web-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6px\" (UniqueName: \"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-kube-api-access-wd6px\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206934 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 
19:20:15.207023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.206985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-out\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207276 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.207023 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.207276 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.207037 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-volume\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.208350 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.208195 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.208350 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.208293 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde45d40-ca73-4c9c-a0c1-eed42798f767-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.209836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.209807 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.209983 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.209874 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.209983 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.209885 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-web-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.210271 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.210247 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.210271 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.210259 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.210377 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.210305 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.210770 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.210749 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.210850 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.210832 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cde45d40-ca73-4c9c-a0c1-eed42798f767-config-out\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.211116 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.211096 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cde45d40-ca73-4c9c-a0c1-eed42798f767-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.218559 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.218538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6px\" (UniqueName: 
\"kubernetes.io/projected/cde45d40-ca73-4c9c-a0c1-eed42798f767-kube-api-access-wd6px\") pod \"alertmanager-main-0\" (UID: \"cde45d40-ca73-4c9c-a0c1-eed42798f767\") " pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.332111 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.332089 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 28 19:20:15.676954 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.676929 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 28 19:20:15.679144 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:20:15.679118 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde45d40_ca73_4c9c_a0c1_eed42798f767.slice/crio-a1ee3f23a3f2d01377663cdb99f3cedb6e547eafb9601897f8927b6566f3775c WatchSource:0}: Error finding container a1ee3f23a3f2d01377663cdb99f3cedb6e547eafb9601897f8927b6566f3775c: Status 404 returned error can't find the container with id a1ee3f23a3f2d01377663cdb99f3cedb6e547eafb9601897f8927b6566f3775c Apr 28 19:20:15.964708 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.964632 2565 generic.go:358] "Generic (PLEG): container finished" podID="cde45d40-ca73-4c9c-a0c1-eed42798f767" containerID="36728175d775effac51b1cc0b7d6b6ba71497088696324e0a23b62faa13dabf6" exitCode=0 Apr 28 19:20:15.965120 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.964722 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerDied","Data":"36728175d775effac51b1cc0b7d6b6ba71497088696324e0a23b62faa13dabf6"} Apr 28 19:20:15.965120 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:15.964760 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"a1ee3f23a3f2d01377663cdb99f3cedb6e547eafb9601897f8927b6566f3775c"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"c31c79a5cf0f013be9d96f809aac88d673106ad1b568347a81fa4ea46cde0e3e"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970951 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"d31e2e413c1b80c6045457e368d19b9602783ba8fda2e1e02f27d41e001af423"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970961 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"997d0b7a016dc0bdb6a06372a135f912eb2e871efa302620006450d2242de455"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970970 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"ed1b8d424ff8054a9b397d109cce9416cbe6d623c707ac70e9833c0b8f58d8a4"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970979 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"80a9041ffb67ce52beb938e4d4e7287eb94a2d3d1dea5570696cea90f9a0e1e5"} Apr 28 19:20:16.970990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:16.970987 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cde45d40-ca73-4c9c-a0c1-eed42798f767","Type":"ContainerStarted","Data":"e788d1f0757a69004587b35f7bc37d2dc6c51aef47ad453b94a798c73d703338"} Apr 28 19:20:17.009972 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.009921 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.009889318 podStartE2EDuration="3.009889318s" podCreationTimestamp="2026-04-28 19:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:20:17.008139386 +0000 UTC m=+274.407884464" watchObservedRunningTime="2026-04-28 19:20:17.009889318 +0000 UTC m=+274.409634430" Apr 28 19:20:17.901077 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901041 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:17.901542 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901447 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="prometheus" containerID="cri-o://280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791" gracePeriod=600 Apr 28 19:20:17.901542 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901505 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-thanos" containerID="cri-o://109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03" gracePeriod=600 Apr 28 19:20:17.901542 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901511 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" 
containerName="kube-rbac-proxy-web" containerID="cri-o://b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe" gracePeriod=600 Apr 28 19:20:17.901778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901482 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="thanos-sidecar" containerID="cri-o://790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2" gracePeriod=600 Apr 28 19:20:17.901778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901462 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy" containerID="cri-o://91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683" gracePeriod=600 Apr 28 19:20:17.901778 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:17.901504 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="config-reloader" containerID="cri-o://6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186" gracePeriod=600 Apr 28 19:20:18.148613 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.148589 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:18.233305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233235 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233271 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233295 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233318 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233349 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " 
Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233382 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233425 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233451 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233539 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233490 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.233838 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233641 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). 
InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:18.233981 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233959 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234059 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.233989 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:18.234059 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234005 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234059 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234037 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvksj\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234217 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234073 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234217 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234104 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234217 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234148 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234217 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234179 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234428 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234230 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234428 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234282 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"cda13ef8-dc6e-4c55-9449-4617c04114b5\" (UID: \"cda13ef8-dc6e-4c55-9449-4617c04114b5\") " Apr 28 19:20:18.234914 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234577 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.234914 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234603 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-metrics-client-ca\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.235091 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.234950 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:18.235509 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.235481 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:18.235860 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.235829 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:18.236476 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.236438 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out" (OuterVolumeSpecName: "config-out") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:20:18.236571 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.236483 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:20:18.237360 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237324 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:18.237455 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237404 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.237455 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237429 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.237583 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237472 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.237583 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237486 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.237856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237819 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.237962 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237881 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.238020 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.237979 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config" (OuterVolumeSpecName: "config") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.238711 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.238685 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj" (OuterVolumeSpecName: "kube-api-access-wvksj") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "kube-api-access-wvksj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:18.238928 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.238881 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.247924 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.247873 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config" (OuterVolumeSpecName: "web-config") pod "cda13ef8-dc6e-4c55-9449-4617c04114b5" (UID: "cda13ef8-dc6e-4c55-9449-4617c04114b5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:18.335648 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335627 2565 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-config\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335648 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335648 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvksj\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-kube-api-access-wvksj\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335659 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-db\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335668 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335678 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cda13ef8-dc6e-4c55-9449-4617c04114b5-tls-assets\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335687 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335697 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335706 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335715 2565 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-grpc-tls\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath 
\"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335724 2565 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-metrics-client-certs\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335732 2565 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-kube-rbac-proxy\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335740 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-web-config\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335751 2565 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335760 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cda13ef8-dc6e-4c55-9449-4617c04114b5-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.335767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.335770 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cda13ef8-dc6e-4c55-9449-4617c04114b5-config-out\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.336148 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:18.335780 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cda13ef8-dc6e-4c55-9449-4617c04114b5-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:20:18.981991 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.981956 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03" exitCode=0 Apr 28 19:20:18.981991 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.981981 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683" exitCode=0 Apr 28 19:20:18.981991 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.981987 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe" exitCode=0 Apr 28 19:20:18.981991 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.981993 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2" exitCode=0 Apr 28 19:20:18.981991 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.981999 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186" exitCode=0 Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982004 2565 generic.go:358] "Generic (PLEG): container finished" podID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791" exitCode=0 Apr 28 19:20:18.982347 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:20:18.982033 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982074 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982083 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982096 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03" Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982086 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982212 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982230 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} Apr 28 19:20:18.982347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.982261 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cda13ef8-dc6e-4c55-9449-4617c04114b5","Type":"ContainerDied","Data":"2a509f7f70147e4b5f9c4b55046e76a2e39d66912f4a8a81578a90295938d340"} Apr 28 19:20:18.989591 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.989570 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683" Apr 28 19:20:18.998218 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:18.998198 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe" Apr 28 19:20:19.004373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.004356 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2" Apr 28 19:20:19.008826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.008801 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:19.013748 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.013728 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186" Apr 28 19:20:19.016591 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.016572 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:19.021706 
ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.021686 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.028215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.028201 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.034290 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.034275 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.034562 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.034540 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.034617 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.034572 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.034617 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.034590 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.034812 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.034794 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.034855 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.034816 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.034855 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.034832 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.035041 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.035025 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.035097 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035049 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.035097 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035070 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.035314 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.035299 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.035354 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035321 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist"
Apr 28 19:20:19.035354 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035338 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.035568 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.035550 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.035651 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035569 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist"
Apr 28 19:20:19.035651 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035583 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.035771 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.035757 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.035813 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035773 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist"
Apr 28 19:20:19.035813 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.035785 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.036024 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:20:19.036004 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.036086 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036027 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist"
Apr 28 19:20:19.036086 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036041 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.036269 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036249 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.036322 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036271 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.036455 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036437 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.036501 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036457 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.036647 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036630 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.036693 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036648 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.036835 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036814 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist"
Apr 28 19:20:19.036916 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.036837 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.037075 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037054 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist"
Apr 28 19:20:19.037138 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037078 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.037325 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037309 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist"
Apr 28 19:20:19.037385 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037328 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.037880 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037855 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist"
Apr 28 19:20:19.037993 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.037889 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.038221 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038202 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.038294 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038221 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.038474 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038446 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.038546 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038477 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.038720 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038699 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.038771 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038723 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.039009 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.038984 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist"
Apr 28 19:20:19.039009 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039006 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.039160 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039064 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 28 19:20:19.039228 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039209 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist"
Apr 28 19:20:19.039281 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039230 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.039416 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039400 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="prometheus"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039419 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="prometheus"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039426 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039432 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039438 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-thanos"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039443 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-thanos"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039457 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-web"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039462 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-web"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039436 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist"
Apr 28 19:20:19.039469 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039472 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="init-config-reloader"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039478 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="init-config-reloader"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039478 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039486 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="thanos-sidecar"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039492 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="thanos-sidecar"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039498 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="config-reloader"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039503 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="config-reloader"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039569 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-web"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039582 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy-thanos"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039592 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="thanos-sidecar"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039602 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="kube-rbac-proxy"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039614 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="prometheus"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039623 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" containerName="config-reloader"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039682 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist"
Apr 28 19:20:19.039873 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039697 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.040373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039938 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.040373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.039960 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.040373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040174 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.040373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040189 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.040510 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040378 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.040510 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040393 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.040607 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040587 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist"
Apr 28 19:20:19.040641 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040608 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.040805 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040786 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist"
Apr 28 19:20:19.040879 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.040807 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.041169 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041152 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist"
Apr 28 19:20:19.041238 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041170 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.041368 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041345 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist"
Apr 28 19:20:19.041415 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041370 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.041584 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041559 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.041584 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041583 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.041799 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041783 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.041843 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041799 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.042011 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.041994 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.042067 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042011 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.042219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042202 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist"
Apr 28 19:20:19.042219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042218 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"
Apr 28 19:20:19.042411 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042396 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist"
Apr 28 19:20:19.042411 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042410 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"
Apr 28 19:20:19.042579 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042564 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist"
Apr 28 19:20:19.042579 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042578 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"
Apr 28 19:20:19.042757 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042741 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist"
Apr 28 19:20:19.042757 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042756 2565 scope.go:117] "RemoveContainer" containerID="109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"
Apr 28 19:20:19.042926 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042892 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03"} err="failed to get container status \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": rpc error: code = NotFound desc = could not find container \"109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03\": container with ID starting with 109fd527fd25608545c428f359deeb1c966f2e4b50ff469476da1415629a8e03 not found: ID does not exist"
Apr 28 19:20:19.042926 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.042925 2565 scope.go:117] "RemoveContainer" containerID="91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"
Apr 28 19:20:19.043111 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043094 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683"} err="failed to get container status \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": rpc error: code = NotFound desc = could not find container \"91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683\": container with ID starting with 91faf80ce8795ed00a7cccdd26b58734c642baf415f51264d6a483e7071b3683 not found: ID does not exist"
Apr 28 19:20:19.043175 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043112 2565 scope.go:117] "RemoveContainer" containerID="b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"
Apr 28 19:20:19.043343 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043321 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe"} err="failed to get container status \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": rpc error: code = NotFound desc = could not find container \"b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe\": container with ID starting with b541a12388d6eac81a53ff1903770908557c7a5a6bea1e8d489709d95d10f2fe not found: ID does not exist"
Apr 28 19:20:19.043382 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043345 2565 scope.go:117] "RemoveContainer" containerID="790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"
Apr 28 19:20:19.043484 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043469 2565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.043560 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043540 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2"} err="failed to get container status \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": rpc error: code = NotFound desc = could not find container \"790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2\": container with ID starting with 790e2919fa10ba5ec8afda0a7969d62bc9ae93389e2db1eb461ed48a8722f5a2 not found: ID does not exist" Apr 28 19:20:19.043603 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043562 2565 scope.go:117] "RemoveContainer" containerID="6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186" Apr 28 19:20:19.043846 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043806 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186"} err="failed to get container status \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": rpc error: code = NotFound desc = could not find container \"6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186\": container with ID starting with 6df1f261365589fef57315b77b2b9254bb29c53956289ab8c2897ee5d714d186 not found: ID does not exist" Apr 28 19:20:19.043990 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.043848 2565 scope.go:117] "RemoveContainer" containerID="280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791" Apr 28 19:20:19.044134 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.044114 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791"} err="failed to 
get container status \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": rpc error: code = NotFound desc = could not find container \"280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791\": container with ID starting with 280bce37e65283ecc036af4224a36cf2a57fe5fc2535712764d6b23a03718791 not found: ID does not exist" Apr 28 19:20:19.044189 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.044134 2565 scope.go:117] "RemoveContainer" containerID="5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a" Apr 28 19:20:19.044358 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.044340 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a"} err="failed to get container status \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": rpc error: code = NotFound desc = could not find container \"5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a\": container with ID starting with 5ed81fcd5d5104b0bbfa4d4b227934a838b1eefed8091befd55c59bed5b1f81a not found: ID does not exist" Apr 28 19:20:19.047934 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.047913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:20:19.048168 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048149 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:20:19.048463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048303 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:20:19.048463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048171 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1oeck03afes0m\"" Apr 28 19:20:19.048463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048225 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:20:19.048463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048246 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:20:19.048463 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048417 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:20:19.048776 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048667 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:20:19.048776 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.048698 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:20:19.049102 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.049087 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:20:19.049234 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.049200 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 19:20:19.049388 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.049373 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:20:19.049443 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.049430 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8tqr\"" Apr 28 19:20:19.052577 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.052558 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:20:19.056159 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.056143 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:20:19.058760 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.058741 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:19.123779 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.123744 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda13ef8-dc6e-4c55-9449-4617c04114b5" path="/var/lib/kubelet/pods/cda13ef8-dc6e-4c55-9449-4617c04114b5/volumes" Apr 28 19:20:19.141466 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141439 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141564 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141472 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141564 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141500 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141564 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141550 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsnd\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-kube-api-access-bgsnd\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141725 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141580 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141725 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141722 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 
19:20:19.141826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141782 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141826 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141988 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141868 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141988 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141988 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.141988 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141976 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.142144 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.141993 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.142144 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.142011 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.142144 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:20:19.142031 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.142144 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.142068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.242590 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242523 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.242590 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242571 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.242590 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242590 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242619 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242638 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242652 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242687 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242715 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242745 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242794 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242821 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsnd\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-kube-api-access-bgsnd\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242851 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242917 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.242952 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.243255 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.243304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 
19:20:19.243338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.243658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.243377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.245676 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.246694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.247378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 
19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.247451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.247927 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.248050 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.248183 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.248405 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 28 19:20:19.248997 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.248763 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.249776 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.249290 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa52a0b0-acb7-4163-afe1-bdacb48696a2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.249776 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.249763 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.249914 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.249861 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.250564 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.250450 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config\") pod \"prometheus-k8s-0\" (UID: 
\"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.250564 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.250496 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.251667 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.251638 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa52a0b0-acb7-4163-afe1-bdacb48696a2-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.252041 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.252021 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa52a0b0-acb7-4163-afe1-bdacb48696a2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.256423 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.256403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsnd\" (UniqueName: \"kubernetes.io/projected/fa52a0b0-acb7-4163-afe1-bdacb48696a2-kube-api-access-bgsnd\") pod \"prometheus-k8s-0\" (UID: \"fa52a0b0-acb7-4163-afe1-bdacb48696a2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.352730 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.352701 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:19.480562 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.480533 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:20:19.483990 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:20:19.483963 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa52a0b0_acb7_4163_afe1_bdacb48696a2.slice/crio-e76702dc203396f05db3e0946e85deed384acecfab1a04a4eb71a8cac650d625 WatchSource:0}: Error finding container e76702dc203396f05db3e0946e85deed384acecfab1a04a4eb71a8cac650d625: Status 404 returned error can't find the container with id e76702dc203396f05db3e0946e85deed384acecfab1a04a4eb71a8cac650d625 Apr 28 19:20:19.987773 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.987734 2565 generic.go:358] "Generic (PLEG): container finished" podID="fa52a0b0-acb7-4163-afe1-bdacb48696a2" containerID="bd3c609bb59622b07a30ef90605420296bac753cbfc9f8b1dc33aa1a9bd52e67" exitCode=0 Apr 28 19:20:19.987957 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.987824 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerDied","Data":"bd3c609bb59622b07a30ef90605420296bac753cbfc9f8b1dc33aa1a9bd52e67"} Apr 28 19:20:19.987957 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:19.987857 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"e76702dc203396f05db3e0946e85deed384acecfab1a04a4eb71a8cac650d625"} Apr 28 19:20:20.995234 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995201 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"b393bb0f081271b0dfba2f2126c3c41a11981f9a702663d0764d9a651a91e3d7"} Apr 28 19:20:20.995234 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"d4c59429fadf0afeda9e03f59f55a5b743ea3f068f448b88333963237b4e26c8"} Apr 28 19:20:20.995616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"d6e47796104559ac47403075072c80ba5bcab52ceb949dbc731398dfdc50d0a5"} Apr 28 19:20:20.995616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"33c592c14f9414242367a81517f8a54801a5506e617860d563e538859032423e"} Apr 28 19:20:20.995616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995263 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"572a61bf00547eb834de1ff83c8174c4d4ff3561bda58e2f034349011dce6fa0"} Apr 28 19:20:20.995616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:20.995272 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa52a0b0-acb7-4163-afe1-bdacb48696a2","Type":"ContainerStarted","Data":"1a8763282b91728918caf4d3091b19a39f360f222dc88de0856bffba912fb69c"} Apr 28 19:20:21.023523 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:21.023478 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.023464965 podStartE2EDuration="2.023464965s" podCreationTimestamp="2026-04-28 19:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:20:21.022006929 +0000 UTC m=+278.421752006" watchObservedRunningTime="2026-04-28 19:20:21.023464965 +0000 UTC m=+278.423210108" Apr 28 19:20:24.352967 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:24.352933 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:20:43.039023 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:43.038991 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:20:43.039433 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:43.039381 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:20:43.042655 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:43.042634 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:20:43.043014 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:43.042997 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:20:43.053291 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:20:43.053274 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:21:19.352913 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:21:19.352863 2565 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:19.367727 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:21:19.367701 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:21:20.186140 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:21:20.186113 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:23:51.109656 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.109620 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-848pf"] Apr 28 19:23:51.112726 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.112708 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-848pf" Apr 28 19:23:51.115752 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.115722 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vtdql\"" Apr 28 19:23:51.115886 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.115822 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 28 19:23:51.116152 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.116136 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:23:51.116580 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.116562 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:23:51.128892 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.128867 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-848pf"] Apr 28 19:23:51.144734 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.144713 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5jqld\" (UniqueName: \"kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld\") pod \"s3-init-848pf\" (UID: \"073a3b42-7ec9-436b-8c21-a22eb0ff2c83\") " pod="kserve/s3-init-848pf" Apr 28 19:23:51.245301 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.245274 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqld\" (UniqueName: \"kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld\") pod \"s3-init-848pf\" (UID: \"073a3b42-7ec9-436b-8c21-a22eb0ff2c83\") " pod="kserve/s3-init-848pf" Apr 28 19:23:51.254815 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.254795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqld\" (UniqueName: \"kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld\") pod \"s3-init-848pf\" (UID: \"073a3b42-7ec9-436b-8c21-a22eb0ff2c83\") " pod="kserve/s3-init-848pf" Apr 28 19:23:51.431211 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.431155 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-848pf" Apr 28 19:23:51.545960 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.545932 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-848pf"] Apr 28 19:23:51.548365 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:23:51.548331 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod073a3b42_7ec9_436b_8c21_a22eb0ff2c83.slice/crio-d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478 WatchSource:0}: Error finding container d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478: Status 404 returned error can't find the container with id d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478 Apr 28 19:23:51.550253 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.550239 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:23:51.594491 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:51.594470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-848pf" event={"ID":"073a3b42-7ec9-436b-8c21-a22eb0ff2c83","Type":"ContainerStarted","Data":"d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478"} Apr 28 19:23:56.610822 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:56.610783 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-848pf" event={"ID":"073a3b42-7ec9-436b-8c21-a22eb0ff2c83","Type":"ContainerStarted","Data":"38fa83078cec71a85eb2ea4b6fb0447f1e9a8354964a0eef7889af0bae369e77"} Apr 28 19:23:56.626561 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:56.626513 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-848pf" podStartSLOduration=1.109795884 podStartE2EDuration="5.626499895s" podCreationTimestamp="2026-04-28 19:23:51 +0000 UTC" firstStartedPulling="2026-04-28 19:23:51.550374439 +0000 UTC m=+488.950119495" 
lastFinishedPulling="2026-04-28 19:23:56.067078436 +0000 UTC m=+493.466823506" observedRunningTime="2026-04-28 19:23:56.625330551 +0000 UTC m=+494.025075629" watchObservedRunningTime="2026-04-28 19:23:56.626499895 +0000 UTC m=+494.026244973" Apr 28 19:23:59.621784 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:59.621755 2565 generic.go:358] "Generic (PLEG): container finished" podID="073a3b42-7ec9-436b-8c21-a22eb0ff2c83" containerID="38fa83078cec71a85eb2ea4b6fb0447f1e9a8354964a0eef7889af0bae369e77" exitCode=0 Apr 28 19:23:59.622172 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:23:59.621817 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-848pf" event={"ID":"073a3b42-7ec9-436b-8c21-a22eb0ff2c83","Type":"ContainerDied","Data":"38fa83078cec71a85eb2ea4b6fb0447f1e9a8354964a0eef7889af0bae369e77"} Apr 28 19:24:00.748803 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:00.748782 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-848pf" Apr 28 19:24:00.829161 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:00.829132 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jqld\" (UniqueName: \"kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld\") pod \"073a3b42-7ec9-436b-8c21-a22eb0ff2c83\" (UID: \"073a3b42-7ec9-436b-8c21-a22eb0ff2c83\") " Apr 28 19:24:00.831170 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:00.831140 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld" (OuterVolumeSpecName: "kube-api-access-5jqld") pod "073a3b42-7ec9-436b-8c21-a22eb0ff2c83" (UID: "073a3b42-7ec9-436b-8c21-a22eb0ff2c83"). InnerVolumeSpecName "kube-api-access-5jqld". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:24:00.930244 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:00.930186 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jqld\" (UniqueName: \"kubernetes.io/projected/073a3b42-7ec9-436b-8c21-a22eb0ff2c83-kube-api-access-5jqld\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:24:01.628305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:01.628278 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-848pf" Apr 28 19:24:01.628305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:01.628286 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-848pf" event={"ID":"073a3b42-7ec9-436b-8c21-a22eb0ff2c83","Type":"ContainerDied","Data":"d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478"} Apr 28 19:24:01.628492 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:24:01.628316 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6125631bc74979822cf7db2fe54a22ac48c64f6727cbdb132d6127ab80d7478" Apr 28 19:25:43.063202 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:25:43.063172 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:25:43.064143 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:25:43.064120 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:25:43.066929 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:25:43.066885 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:25:43.067771 ip-10-0-138-119 
kubenswrapper[2565]: I0428 19:25:43.067748 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:30:43.086956 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:30:43.086877 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:30:43.088150 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:30:43.088125 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:30:43.090512 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:30:43.090486 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:30:43.091610 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:30:43.091587 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:35:43.110348 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:35:43.110318 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:35:43.112589 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:35:43.112558 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log" Apr 28 19:35:43.113946 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:35:43.113927 2565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:35:43.116347 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:35:43.116325 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log" Apr 28 19:37:59.559218 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.559187 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qh94g/must-gather-jpzz6"] Apr 28 19:37:59.561488 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.559536 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073a3b42-7ec9-436b-8c21-a22eb0ff2c83" containerName="s3-init" Apr 28 19:37:59.561488 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.559547 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="073a3b42-7ec9-436b-8c21-a22eb0ff2c83" containerName="s3-init" Apr 28 19:37:59.561488 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.559610 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="073a3b42-7ec9-436b-8c21-a22eb0ff2c83" containerName="s3-init" Apr 28 19:37:59.562215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.562199 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.564124 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.564103 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qh94g\"/\"openshift-service-ca.crt\"" Apr 28 19:37:59.564668 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.564649 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qh94g\"/\"kube-root-ca.crt\"" Apr 28 19:37:59.564668 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.564659 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qh94g\"/\"default-dockercfg-ggtwx\"" Apr 28 19:37:59.568767 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.568734 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qh94g/must-gather-jpzz6"] Apr 28 19:37:59.688723 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.688685 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkssf\" (UniqueName: \"kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.688921 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.688787 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.790043 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.790007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.790135 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.790072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkssf\" (UniqueName: \"kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.790392 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.790368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.797755 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.797723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkssf\" (UniqueName: \"kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf\") pod \"must-gather-jpzz6\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.882188 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.882100 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:37:59.998593 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:37:59.998555 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qh94g/must-gather-jpzz6"] Apr 28 19:38:00.001562 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:38:00.001534 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20605bf2_d540_4c49_8a0a_b74f99e33315.slice/crio-7b167f00d244c24779616925dee2de7b3e64fc6d73a44643c40071fcb61b8c88 WatchSource:0}: Error finding container 7b167f00d244c24779616925dee2de7b3e64fc6d73a44643c40071fcb61b8c88: Status 404 returned error can't find the container with id 7b167f00d244c24779616925dee2de7b3e64fc6d73a44643c40071fcb61b8c88 Apr 28 19:38:00.003346 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:00.003330 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:38:00.060985 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:00.060948 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qh94g/must-gather-jpzz6" event={"ID":"20605bf2-d540-4c49-8a0a-b74f99e33315","Type":"ContainerStarted","Data":"7b167f00d244c24779616925dee2de7b3e64fc6d73a44643c40071fcb61b8c88"} Apr 28 19:38:06.082787 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:06.082751 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qh94g/must-gather-jpzz6" event={"ID":"20605bf2-d540-4c49-8a0a-b74f99e33315","Type":"ContainerStarted","Data":"92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470"} Apr 28 19:38:06.082787 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:06.082793 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qh94g/must-gather-jpzz6" 
event={"ID":"20605bf2-d540-4c49-8a0a-b74f99e33315","Type":"ContainerStarted","Data":"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"} Apr 28 19:38:06.099219 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:06.099153 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qh94g/must-gather-jpzz6" podStartSLOduration=2.037479011 podStartE2EDuration="7.099133582s" podCreationTimestamp="2026-04-28 19:37:59 +0000 UTC" firstStartedPulling="2026-04-28 19:38:00.003455107 +0000 UTC m=+1337.403200162" lastFinishedPulling="2026-04-28 19:38:05.065109673 +0000 UTC m=+1342.464854733" observedRunningTime="2026-04-28 19:38:06.096780375 +0000 UTC m=+1343.496525452" watchObservedRunningTime="2026-04-28 19:38:06.099133582 +0000 UTC m=+1343.498878662" Apr 28 19:38:23.141661 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:23.141625 2565 generic.go:358] "Generic (PLEG): container finished" podID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerID="c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1" exitCode=0 Apr 28 19:38:23.142100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:23.141696 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qh94g/must-gather-jpzz6" event={"ID":"20605bf2-d540-4c49-8a0a-b74f99e33315","Type":"ContainerDied","Data":"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"} Apr 28 19:38:23.142100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:23.142039 2565 scope.go:117] "RemoveContainer" containerID="c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1" Apr 28 19:38:23.229532 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:23.229474 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qh94g_must-gather-jpzz6_20605bf2-d540-4c49-8a0a-b74f99e33315/gather/0.log" Apr 28 19:38:26.474114 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:26.474074 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-ncgxg_1059a8ad-7584-4de3-8259-c624717ec350/global-pull-secret-syncer/0.log" Apr 28 19:38:26.580243 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:26.580209 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dn4qh_4a28734c-75dc-4444-ae1f-b70d31a241e2/konnectivity-agent/0.log" Apr 28 19:38:26.708599 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:26.708558 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-119.ec2.internal_9d1685eafe28745f79356d1935bdc8f9/haproxy/0.log" Apr 28 19:38:28.596032 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.595995 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qh94g/must-gather-jpzz6"] Apr 28 19:38:28.596404 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.596219 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-qh94g/must-gather-jpzz6" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="copy" containerID="cri-o://92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470" gracePeriod=2 Apr 28 19:38:28.598087 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.598054 2565 status_manager.go:895] "Failed to get status for pod" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" pod="openshift-must-gather-qh94g/must-gather-jpzz6" err="pods \"must-gather-jpzz6\" is forbidden: User \"system:node:ip-10-0-138-119.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qh94g\": no relationship found between node 'ip-10-0-138-119.ec2.internal' and this object" Apr 28 19:38:28.598848 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.598820 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qh94g/must-gather-jpzz6"] Apr 28 19:38:28.823703 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.823679 2565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qh94g_must-gather-jpzz6_20605bf2-d540-4c49-8a0a-b74f99e33315/copy/0.log" Apr 28 19:38:28.824047 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.824031 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qh94g/must-gather-jpzz6" Apr 28 19:38:28.825562 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.825539 2565 status_manager.go:895] "Failed to get status for pod" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" pod="openshift-must-gather-qh94g/must-gather-jpzz6" err="pods \"must-gather-jpzz6\" is forbidden: User \"system:node:ip-10-0-138-119.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qh94g\": no relationship found between node 'ip-10-0-138-119.ec2.internal' and this object" Apr 28 19:38:28.839810 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.839789 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkssf\" (UniqueName: \"kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf\") pod \"20605bf2-d540-4c49-8a0a-b74f99e33315\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " Apr 28 19:38:28.839869 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.839859 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output\") pod \"20605bf2-d540-4c49-8a0a-b74f99e33315\" (UID: \"20605bf2-d540-4c49-8a0a-b74f99e33315\") " Apr 28 19:38:28.841218 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.841193 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "20605bf2-d540-4c49-8a0a-b74f99e33315" (UID: 
"20605bf2-d540-4c49-8a0a-b74f99e33315"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:38:28.841893 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.841875 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf" (OuterVolumeSpecName: "kube-api-access-qkssf") pod "20605bf2-d540-4c49-8a0a-b74f99e33315" (UID: "20605bf2-d540-4c49-8a0a-b74f99e33315"). InnerVolumeSpecName "kube-api-access-qkssf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:38:28.940660 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.940613 2565 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20605bf2-d540-4c49-8a0a-b74f99e33315-must-gather-output\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:38:28.940660 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:28.940656 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkssf\" (UniqueName: \"kubernetes.io/projected/20605bf2-d540-4c49-8a0a-b74f99e33315-kube-api-access-qkssf\") on node \"ip-10-0-138-119.ec2.internal\" DevicePath \"\"" Apr 28 19:38:29.125040 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.124959 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" path="/var/lib/kubelet/pods/20605bf2-d540-4c49-8a0a-b74f99e33315/volumes" Apr 28 19:38:29.161864 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.161834 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qh94g_must-gather-jpzz6_20605bf2-d540-4c49-8a0a-b74f99e33315/copy/0.log" Apr 28 19:38:29.162188 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.162165 2565 generic.go:358] "Generic (PLEG): container finished" podID="20605bf2-d540-4c49-8a0a-b74f99e33315" 
containerID="92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470" exitCode=143
Apr 28 19:38:29.162255 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.162220 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qh94g/must-gather-jpzz6"
Apr 28 19:38:29.162292 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.162278 2565 scope.go:117] "RemoveContainer" containerID="92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470"
Apr 28 19:38:29.170304 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.170280 2565 scope.go:117] "RemoveContainer" containerID="c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"
Apr 28 19:38:29.182582 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.182556 2565 scope.go:117] "RemoveContainer" containerID="92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470"
Apr 28 19:38:29.182917 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:38:29.182870 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470\": container with ID starting with 92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470 not found: ID does not exist" containerID="92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470"
Apr 28 19:38:29.183010 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.182960 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470"} err="failed to get container status \"92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470\": rpc error: code = NotFound desc = could not find container \"92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470\": container with ID starting with 92241f672ffab2df4a764cc59b3a75612c9874d854909e661dc72e00e95b8470 not found: ID does not exist"
Apr 28 19:38:29.183010 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.182987 2565 scope.go:117] "RemoveContainer" containerID="c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"
Apr 28 19:38:29.183288 ip-10-0-138-119 kubenswrapper[2565]: E0428 19:38:29.183263 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1\": container with ID starting with c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1 not found: ID does not exist" containerID="c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"
Apr 28 19:38:29.183385 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.183292 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1"} err="failed to get container status \"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1\": rpc error: code = NotFound desc = could not find container \"c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1\": container with ID starting with c91e6dc2c4b2bcb5783878b0343351034b7d5b7649d926f7784022c51e9c99c1 not found: ID does not exist"
Apr 28 19:38:29.751166 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.751129 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/alertmanager/0.log"
Apr 28 19:38:29.785214 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.785186 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/config-reloader/0.log"
Apr 28 19:38:29.807731 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.807709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/kube-rbac-proxy-web/0.log"
Apr 28 19:38:29.836661 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.836631 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/kube-rbac-proxy/0.log"
Apr 28 19:38:29.861616 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.861586 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/kube-rbac-proxy-metric/0.log"
Apr 28 19:38:29.893822 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.893793 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/prom-label-proxy/0.log"
Apr 28 19:38:29.925100 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.925065 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cde45d40-ca73-4c9c-a0c1-eed42798f767/init-config-reloader/0.log"
Apr 28 19:38:29.965710 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:29.965678 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/1.log"
Apr 28 19:38:30.129913 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.129863 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4h69b_8bf92c20-e551-497f-995e-ea716db91e5d/cluster-monitoring-operator/0.log"
Apr 28 19:38:30.267311 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.267286 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6vnbd_7823617f-5797-402d-ae71-da8ae44f45c6/monitoring-plugin/0.log"
Apr 28 19:38:30.383658 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.383578 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n8zk8_801be666-fce3-4981-98a0-f1f3b5b08af0/node-exporter/0.log"
Apr 28 19:38:30.408139 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.408115 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n8zk8_801be666-fce3-4981-98a0-f1f3b5b08af0/kube-rbac-proxy/0.log"
Apr 28 19:38:30.430311 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.430289 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n8zk8_801be666-fce3-4981-98a0-f1f3b5b08af0/init-textfile/0.log"
Apr 28 19:38:30.528170 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.528141 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sptmr_ea906e01-c15b-4bcc-bec1-c9d43cdd11f3/kube-rbac-proxy-main/0.log"
Apr 28 19:38:30.547129 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.547101 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sptmr_ea906e01-c15b-4bcc-bec1-c9d43cdd11f3/kube-rbac-proxy-self/0.log"
Apr 28 19:38:30.566743 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.566717 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-sptmr_ea906e01-c15b-4bcc-bec1-c9d43cdd11f3/openshift-state-metrics/0.log"
Apr 28 19:38:30.605302 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.605274 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/prometheus/0.log"
Apr 28 19:38:30.622551 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.622519 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/config-reloader/0.log"
Apr 28 19:38:30.642227 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.642159 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/thanos-sidecar/0.log"
Apr 28 19:38:30.664467 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.664447 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/kube-rbac-proxy-web/0.log"
Apr 28 19:38:30.685606 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.685578 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/kube-rbac-proxy/0.log"
Apr 28 19:38:30.704430 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.704408 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/kube-rbac-proxy-thanos/0.log"
Apr 28 19:38:30.725714 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:30.725694 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa52a0b0-acb7-4163-afe1-bdacb48696a2/init-config-reloader/0.log"
Apr 28 19:38:32.181215 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:32.181178 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wl476_d16fc61d-1bc8-4386-8d52-c516641eb2f9/networking-console-plugin/0.log"
Apr 28 19:38:32.613462 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:32.613430 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/2.log"
Apr 28 19:38:32.621065 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:32.621036 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z897l_58244bb4-99cf-41d7-91d2-e3c4ffe45e20/console-operator/3.log"
Apr 28 19:38:33.380359 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.380328 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-w77g9_1c325e2f-148f-4151-92f9-55ef3817ae3b/volume-data-source-validator/0.log"
Apr 28 19:38:33.570722 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.570688 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"]
Apr 28 19:38:33.571055 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571042 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="gather"
Apr 28 19:38:33.571103 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571057 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="gather"
Apr 28 19:38:33.571103 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571076 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="copy"
Apr 28 19:38:33.571103 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571081 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="copy"
Apr 28 19:38:33.571236 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571136 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="gather"
Apr 28 19:38:33.571236 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.571145 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="20605bf2-d540-4c49-8a0a-b74f99e33315" containerName="copy"
Apr 28 19:38:33.576248 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.576224 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.578385 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.578358 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"openshift-service-ca.crt\""
Apr 28 19:38:33.578385 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.578379 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rqj59\"/\"default-dockercfg-bqjfr\""
Apr 28 19:38:33.578560 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.578399 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"kube-root-ca.crt\""
Apr 28 19:38:33.580876 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.580855 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"]
Apr 28 19:38:33.677331 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.677234 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-proc\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.677331 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.677298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-podres\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.677538 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.677338 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-lib-modules\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.677538 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.677363 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-sys\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.677538 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.677401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwl7n\" (UniqueName: \"kubernetes.io/projected/a146efa0-0cba-4024-a996-818afb909343-kube-api-access-dwl7n\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778645 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778608 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-podres\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-lib-modules\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-sys\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778711 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwl7n\" (UniqueName: \"kubernetes.io/projected/a146efa0-0cba-4024-a996-818afb909343-kube-api-access-dwl7n\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778760 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-proc\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778769 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-podres\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.778836 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778801 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-sys\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.779066 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778842 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-proc\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.779066 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.778843 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a146efa0-0cba-4024-a996-818afb909343-lib-modules\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.786075 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.786054 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwl7n\" (UniqueName: \"kubernetes.io/projected/a146efa0-0cba-4024-a996-818afb909343-kube-api-access-dwl7n\") pod \"perf-node-gather-daemonset-545b8\" (UID: \"a146efa0-0cba-4024-a996-818afb909343\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:33.887383 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:33.887345 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:34.007882 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.007859 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"]
Apr 28 19:38:34.010950 ip-10-0-138-119 kubenswrapper[2565]: W0428 19:38:34.010919 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda146efa0_0cba_4024_a996_818afb909343.slice/crio-46e0cf6f8f48be7f34cf396c64c3f07e38a320c2aa3c29119e15c801a0076e87 WatchSource:0}: Error finding container 46e0cf6f8f48be7f34cf396c64c3f07e38a320c2aa3c29119e15c801a0076e87: Status 404 returned error can't find the container with id 46e0cf6f8f48be7f34cf396c64c3f07e38a320c2aa3c29119e15c801a0076e87
Apr 28 19:38:34.040958 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.040932 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwwt4_2692402f-8242-404a-8d92-642c2dec47fb/dns/0.log"
Apr 28 19:38:34.060110 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.060089 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwwt4_2692402f-8242-404a-8d92-642c2dec47fb/kube-rbac-proxy/0.log"
Apr 28 19:38:34.177547 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.177505 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8" event={"ID":"a146efa0-0cba-4024-a996-818afb909343","Type":"ContainerStarted","Data":"2e56c250388a870069f38895082ff479bbacb36b795f4631929bdeb4ef53ba0d"}
Apr 28 19:38:34.177547 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.177544 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8" event={"ID":"a146efa0-0cba-4024-a996-818afb909343","Type":"ContainerStarted","Data":"46e0cf6f8f48be7f34cf396c64c3f07e38a320c2aa3c29119e15c801a0076e87"}
Apr 28 19:38:34.177751 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.177632 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:34.191760 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.191671 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8" podStartSLOduration=1.191658182 podStartE2EDuration="1.191658182s" podCreationTimestamp="2026-04-28 19:38:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:38:34.189810024 +0000 UTC m=+1371.589555101" watchObservedRunningTime="2026-04-28 19:38:34.191658182 +0000 UTC m=+1371.591403260"
Apr 28 19:38:34.198798 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.198769 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fqkh6_de956b02-f02a-4203-a743-d9efee946739/dns-node-resolver/0.log"
Apr 28 19:38:34.624742 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:34.624707 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-clds8_96de079e-3abf-48db-8ecf-bcd571c3ed27/node-ca/0.log"
Apr 28 19:38:35.678153 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:35.678119 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qfmts_706dde5a-6656-4009-a585-2d9b3cbd4ecd/serve-healthcheck-canary/0.log"
Apr 28 19:38:36.085129 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:36.085091 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-x7t7n_3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787/insights-operator/0.log"
Apr 28 19:38:36.085326 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:36.085163 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-x7t7n_3a57a1a7-9bff-45c6-a3ab-4fdacd5a6787/insights-operator/1.log"
Apr 28 19:38:36.235708 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:36.235669 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4x8g_114be85d-4b9f-4f04-b1bb-9d17a166efa5/kube-rbac-proxy/0.log"
Apr 28 19:38:36.253374 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:36.253340 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4x8g_114be85d-4b9f-4f04-b1bb-9d17a166efa5/exporter/0.log"
Apr 28 19:38:36.271679 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:36.271635 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4x8g_114be85d-4b9f-4f04-b1bb-9d17a166efa5/extractor/0.log"
Apr 28 19:38:39.750079 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:39.750052 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-848pf_073a3b42-7ec9-436b-8c21-a22eb0ff2c83/s3-init/0.log"
Apr 28 19:38:40.191021 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:40.190994 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-545b8"
Apr 28 19:38:43.769946 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:43.769916 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m69vl_0bf50d78-7e47-47df-b7c8-dfec55922b19/migrator/0.log"
Apr 28 19:38:43.789178 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:43.789150 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-m69vl_0bf50d78-7e47-47df-b7c8-dfec55922b19/graceful-termination/0.log"
Apr 28 19:38:45.250999 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.250889 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/kube-multus-additional-cni-plugins/0.log"
Apr 28 19:38:45.269883 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.269853 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/egress-router-binary-copy/0.log"
Apr 28 19:38:45.288128 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.288101 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/cni-plugins/0.log"
Apr 28 19:38:45.308373 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.308343 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/bond-cni-plugin/0.log"
Apr 28 19:38:45.328305 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.328283 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/routeoverride-cni/0.log"
Apr 28 19:38:45.346856 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.346828 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/whereabouts-cni-bincopy/0.log"
Apr 28 19:38:45.368319 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.368289 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ls4b6_f56b141f-364e-495d-9046-30f1c93dbc83/whereabouts-cni/0.log"
Apr 28 19:38:45.541581 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.541548 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pps95_023534f3-e54d-45bb-b99b-12a35302ae01/kube-multus/0.log"
Apr 28 19:38:45.571019 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.570991 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-88gvq_09653a58-e44e-4fb5-a021-58bc08a4765f/network-metrics-daemon/0.log"
Apr 28 19:38:45.589025 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:45.588989 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-88gvq_09653a58-e44e-4fb5-a021-58bc08a4765f/kube-rbac-proxy/0.log"
Apr 28 19:38:46.619753 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.619720 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/ovn-controller/0.log"
Apr 28 19:38:46.647236 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.647207 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/ovn-acl-logging/0.log"
Apr 28 19:38:46.669540 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.669511 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/kube-rbac-proxy-node/0.log"
Apr 28 19:38:46.691292 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.691244 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 19:38:46.707545 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.707515 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/northd/0.log"
Apr 28 19:38:46.725940 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.725910 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/nbdb/0.log"
Apr 28 19:38:46.744468 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.744441 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/sbdb/0.log"
Apr 28 19:38:46.913077 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:46.912995 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6twb_1a521797-0035-4102-b6e7-e3757c2a296e/ovnkube-controller/0.log"
Apr 28 19:38:48.201090 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:48.201063 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8gbf7_40262fc5-4234-4213-8739-f1ff807f34ec/network-check-target-container/0.log"
Apr 28 19:38:49.203586 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:49.203559 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tntl8_02955ab5-cc29-48ff-8727-30e4575778cb/iptables-alerter/0.log"
Apr 28 19:38:49.812546 ip-10-0-138-119 kubenswrapper[2565]: I0428 19:38:49.812514 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2cht2_d27f468f-a5ab-460e-8afc-5ff534c369dc/tuned/0.log"